Modern Computing: A Short History, 1945-2022
Inspired by A New History of Modern Computing by Thomas Haigh and Paul E. Ceruzzi. But the selection of key events in the journey from ENIAC to Tesla, from Data Processing to Big Data, is mine.
April 1945 John von Neumann’s “First Draft of a Report on the EDVAC,” often called the founding document of modern computing, defines “the stored program concept.”
July 1945 Vannevar Bush publishes “As We May Think,” in which he envisions the “Memex,” a memory extension device serving as a large personal repository of information that could be instantly retrieved through associative links.
December 1945 ENIAC, the first electronic, general-purpose programmable computer, runs hydrogen bomb calculations. It was developed at the Moore School of Electrical Engineering, University of Pennsylvania, under a U.S. Army contract.
1947 Statistician John W. Tukey coins the term “bit” to designate a binary digit, a unit of information stored in a computer.
September 1947 The Association for Computing Machinery (ACM) is founded.
November 1947 At Bell Labs, Walter H. Brattain and John A. Bardeen, under the direction of William B. Shockley, discover the transistor effect, developing and demonstrating a point-contact germanium transistor, later leading to small, low-power electronic devices and eventually low-cost integrated circuits.
December 1948 The Eckert-Mauchly Computer Corporation, the first computer startup, is incorporated (originally formed by Presper Eckert and John Mauchly in 1946 as a partnership, the Electronic Control Company).
1949 Edmund Berkeley publishes Giant Brains: Or Machines That Think in which he writes: “Recently there have been a good deal of news about strange giant machines that can handle information with vast speed and skill… These machines are similar to what a brain would be if it were made of hardware and wire instead of flesh and nerves… A machine can handle information; it can calculate, conclude, and choose; it can perform reasonable operations with information. A machine, therefore, can think.”
1949 The bar code is conceived when 27-year-old Norman Joseph Woodland draws four lines in the sand on a Miami beach. In June 1974, a Universal Product Code (UPC) label was used to ring up purchases at a supermarket for the first time.
March 1951 The U.S. Census Bureau purchases the Univac, developed by the Eckert-Mauchly Computer Corporation (which was acquired by Remington Rand in 1950), establishing the commercial market for computers in the United States. The Univac’s main advantage was the use of magnetic tape in place of labor-intensive punched-card processing. By 1954, 20 Univac computers had been sold, at around a million dollars each.
November 1951 The Lyons Electronic Office (LEO), the first computer used for business applications, runs “bakery valuations” calculations (computing the costs of ingredients) for J. Lyons and Co., a British restaurant chain, food manufacturing, and hotel conglomerate.
Late 1951 The Whirlwind computer, the first real-time high-speed digital computer using random-access magnetic-core memory, becomes operational and is made available for scientific and military research, later leading to the United States Air Force’s Semi-Automatic Ground Environment (SAGE), a continental air-defense system.
January 1952 John Diebold popularizes the term automation (coined at the Ford Motor Company in 1947) with his book Automation: The Advent of the Automatic Factory, based on his research project at the Harvard Business School.
June 1955 The National Security Agency signs a contract with Philco for the development of SOLO, the first general-purpose transistorized computer to operate in the U.S.
August 1955 SHARE, an IBM user group (initially for the IBM 704), is formed, growing to 62 members by the end of the year.
August 1955 The term “artificial intelligence” is coined in a proposal for a summer workshop submitted by John McCarthy (Dartmouth College), Marvin Minsky (Harvard University), Nathaniel Rochester (IBM), and Claude Shannon (Bell Telephone Laboratories). The workshop, which took place in July and August 1956, is generally considered the official birthdate of the new field.
December 1955 Herbert Simon and Allen Newell develop the Logic Theorist, the first artificial intelligence program, which eventually would prove 38 of the first 52 theorems in Whitehead and Russell’s Principia Mathematica.
June 1956 IBM engineer Werner Buchholz coins the term “byte” to describe the smallest unit of information a computer could retrieve and process. IBM later standardized the byte at 8 bits.
September 1956 IBM announces the 305 RAMAC and the 650 RAMAC (Random Access Memory Accounting) which incorporated the 350 Disk Storage Unit, the first computer storage system based on magnetic disks. It came with fifty 24-inch disks and a total capacity of 5 megabytes, weighed 1 ton, and could be leased for $3,200 per month.
April 1957 IBM introduces the programming language Fortran (from “formula translation”).
1958 An international gathering of computer experts defines Algol (for “algorithmic language”), which later became “the seed around which computer science began to crystallize as an academic discipline.”
November 1958 In the Harvard Business Review article “Management in the 1980s,” Harold J. Leavitt and Thomas I. Whisler introduce the term “Information Technology.”
1959 Jack Kilby of Texas Instruments and Robert Noyce of Fairchild Semiconductor separately apply for a patent for the invention of the integrated circuit, or microchip.
1959 John McCarthy of MIT suggests the concept of time-sharing, allowing mainframes (and later, minicomputers) to simultaneously support several users. In 1961, McCarthy expanded on this concept to envision a “computer utility” similar to the telephone system.
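The core of the time-sharing idea is that a fast machine cycles among many users’ jobs so quickly that each user feels they have the computer to themselves. A minimal round-robin sketch of that scheduling pattern (the user names and slice counts below are invented for illustration):

```python
from collections import deque

# Each entry is (user, time slices of work remaining); values are illustrative.
jobs = deque([("alice", 3), ("bob", 2), ("carol", 4)])

while jobs:
    user, remaining = jobs.popleft()           # give the CPU to the next job
    print(f"running {user}'s job for one time slice")
    if remaining > 1:
        jobs.append((user, remaining - 1))     # unfinished: rejoin the queue
```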
1959 Arthur Samuel coins the term “machine learning,” reporting on programming a computer “so that it will learn to play a better game of checkers than can be played by the person who wrote the program.”
April 1959 A group of academics, computer users, and manufacturers asks the Department of Defense to sponsor the development of a portable programming language that can run on different types of computers. The result is COBOL, or “common business-oriented language.”
November 1959 The Appleton Post-Crescent reports that the Postmaster General is exploring the future possibility of “electronic mail… a letter will cost 15 cents. A nickel to send—and a dime to bribe the electronic brain to forget what it read.”
January 1960 Control Data Corporation delivers the CDC 1604 to the U.S. Navy, establishing the market for supercomputers.
1961 The National Machine Accountants Association (NMAA), established in 1949, is renamed The Data Processing Management Association (DPMA). In 1997, DPMA was renamed the Association of Information Technology Professionals (AITP).
1961 Ole-Johan Dahl and Kristen Nygaard at the Norwegian Computer Center start developing SIMULA, a description language for computer simulations. SIMULA 67, completed in 1967, introduced all the elements essential in object-oriented programming languages such as Smalltalk, C++, Java, and Python.
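The elements SIMULA 67 introduced (classes, objects, inheritance, and dynamically dispatched “virtual” methods) can be illustrated in a few lines of Python, one of its descendants named above; the example itself is invented, not drawn from SIMULA:

```python
class Vehicle:                      # a class bundles state with behavior
    def __init__(self, speed):
        self.speed = speed          # per-object state

    def describe(self):             # a method, dispatched dynamically
        return f"vehicle moving at {self.speed} km/h"

class Truck(Vehicle):               # inheritance: a Truck is-a Vehicle
    def describe(self):             # overrides the parent's method
        return f"truck moving at {self.speed} km/h"

# Dynamic ("virtual") dispatch: the same call selects each object's method.
for v in [Vehicle(80), Truck(60)]:
    print(v.describe())
```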
1961 The first industrial robot, Unimate, starts working on an assembly line in a General Motors plant in New Jersey.
December 1962 Virtual memory, a way to make a computer’s main memory seem bigger than it is by swapping data with a slower but larger storage medium, is introduced with the Atlas Computer, designed at the University of Manchester and built by Ferranti Ltd.
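The mechanism behind virtual memory is demand paging: programs address a large virtual space while only a few pages occupy fast core memory, the rest residing on a slower backing store (a drum, in Atlas’s case). A rough Python sketch of the idea; the page size, frame count, and FIFO eviction policy are simplifying assumptions, not the Atlas design:

```python
from collections import OrderedDict

PAGE_SIZE = 512          # words per page (illustrative)
CORE_FRAMES = 4          # fast core memory holds only 4 pages at a time

core = OrderedDict()     # page number -> contents (fast memory)
drum = {}                # page number -> contents (slow backing store)

def access(virtual_address):
    """Return the page holding virtual_address, faulting it in if needed."""
    page = virtual_address // PAGE_SIZE
    if page not in core:                        # page fault
        if len(core) >= CORE_FRAMES:            # core full: evict oldest page
            evicted, contents = core.popitem(last=False)
            drum[evicted] = contents            # write it back to the drum
        core[page] = drum.pop(page, [0] * PAGE_SIZE)  # load (or create) page
    return core[page]

# The program "sees" an address space far larger than the 4 frames of core.
for addr in [0, 5000, 10000, 15000, 20000, 0]:
    access(addr)
print(f"pages in core: {list(core)}, pages on drum: {list(drum)}")
```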
1963 The Institute of Radio Engineers and the American Institute of Electrical Engineers merge to form the Institute of Electrical and Electronics Engineers (IEEE).
1963 The first database management system, the Integrated Data Store (IDS), is operational on a test basis.
1964 John G. Kemeny and Thomas E. Kurtz of Dartmouth College develop BASIC (Beginners’ All-purpose Symbolic Instruction Code), a high-level programming language designed for ease of use and widespread adoption by users of time-sharing and interactive computing systems.
April 1964 IBM announces the System/360 family of computers, all using the same instruction set, facilitating IBM customers’ upgrade path to more powerful computers.
1965 Edward Feigenbaum, Bruce G. Buchanan, Joshua Lederberg, and Carl Djerassi start working on DENDRAL at Stanford University. The first expert system, it automates the decision-making process and problem-solving behavior of organic chemists, with the general aim of studying hypothesis formation and constructing models of empirical induction in science.
March 1965 Digital Equipment Corporation (DEC) introduces the PDP-8, creating the market for minicomputers.
1966 The Artificial Intelligence Center of the Stanford Research Institute (SRI) starts developing SHAKEY, the first mobile intelligent robot.
February 1966 Robert Taylor becomes the director of the Information Processing Techniques Office (IPTO) at the U.S. Defense Department’s Advanced Research Projects Agency (ARPA). He proposes to his boss the ARPAnet, a network that will connect the different projects at different universities that ARPA was sponsoring. At the time, each project had its own specialized terminal, computer system and unique set of user commands.
October 1967 At the first ACM Symposium on Operating Systems Principles, Larry Roberts presents “Multiple computer networks and intercomputer communication,” in which he describes the architecture of the “ARPA net” and argues that giving scientists the ability to explore data and programs residing in remote locations will reduce duplication of effort and result in enormous savings.
1968 The NATO Conference on Software Engineering, an international meeting of software developers, attempts to define and establish standards for how software should be developed.
1968 J.C.R. Licklider and Robert Taylor predict in “The Computer as a Communication Device” that “In a few years, men will be able to communicate more effectively through a machine than face to face.”
August 1968 Donald Davies at the UK’s National Physical Laboratory (NPL) demonstrates publicly for the first time a prototype packet-switching network.
December 1968 Douglas Engelbart demonstrates interactive computer programs controlled by a mouse and connected via a live microwave link to a remote computer in a presentation to the Fall Joint Computer Conference. It became known as “the mother of all demos.”
1969 Several Bell Labs engineers, led by Ken Thompson and Dennis Ritchie, start development of what will become the Unix operating system.
1969 Willard Boyle and George E. Smith at Bell Labs invent the charge-coupled device (CCD). Fairchild Semiconductor was one of the first companies to commercialize the invention, introducing in 1974 a light sensor that led to the development of Kodak’s digital camera.
1969 CompuServe, the first major commercial online service in the U.S., is established as a subsidiary of Golden United Life Insurance, renting time on its PDP-10 to business customers. Ten years later, it began offering a dial-up online information service to consumers.
April 1969 Steve Crocker submits RFC 1, the first “Request for Comment” which became the primary mechanism for the collaborative and open development of the Internet.
October 1969 The first message (“Login”) is sent over the ARPANET between the network node at UCLA and a second one at SRI. By the end of the year, four host computers were connected together into the initial ARPANET.
1970 IBM’s Edgar F. Codd publishes “A Relational Model of Data for Large Shared Data Banks.” Relational databases will become the dominant approach to data management by the end of the 1980s.
1971 Bob Thomas at BBN creates the first computer virus, an experimental self-replicating program called Creeper which copied itself to computers connected to the ARPANET and displayed the message “I’m the creeper, catch me if you can!”
July 4, 1971 Michael Hart launches Project Gutenberg with the goal of making copyright-free works electronically available by entering the text of the U.S. Declaration of Independence into the University of Illinois mainframe he was using, creating a 5K file.
November 1971 Intel advertises the 4004 as “a microprogrammable computer on a chip.”
Late 1971 Ray Tomlinson at BBN writes the code for network email and sends the first email over the ARPANET.
1972 Hewlett-Packard introduces the HP-35, the first handheld calculator to perform transcendental functions (such as trigonometric, logarithmic and exponential functions). Two years later, it introduced the HP-65, the first programmable pocket calculator. The HP-35 and subsequent models have replaced the slide rule, used by generations of engineers and scientists.
1972 DIALOG, the first interactive, online search system, providing access to large text-based databases while allowing iterative refinement of results, is offered commercially.
November 1972 Atari announces Pong, which became the first popular video arcade game.
1973 Charles Bachman, developer of IDS [see 1963], delivers his Turing Award lecture “The Programmer as Navigator,” arguing for “a shift from a computer-centered to the database-centered point of view,” a Copernican revolution driven by database management systems.
January 1973 Mario Cardullo receives the first patent for a passive, read-write Radio Frequency Identification (RFID) tag. Presented to the New York Port Authority in 1971, it consisted of a transponder with 16-bit memory for use as a toll device.
March 1973 Xerox PARC introduces the Alto, the first modern personal computer, supporting a graphical user interface.
May 1973 Bob Metcalfe invents Ethernet and later, with David Boggs, implements it at Xerox PARC. In 1980, as co-founder and CEO of 3Com, Metcalfe convinced DEC, Intel, and Xerox to work together to promote Ethernet as a standard for Local Area Networks (LANs).
November 1973 Unix Version 4 is released, rewritten in the programming language C, developed by Dennis Ritchie.
July 1974 Radio-Electronics features on its cover a kit computer with the tag “Build the Mark-8: Your personal Minicomputer.”
1975 IBM’s John Cocke introduces the idea of the reduced instruction set computer (RISC), leading to the development of the IBM 801, which demonstrated that a simple design outperforms even the most powerful (and complex) classic CPU designs. The concept was taken further by David Patterson at Berkeley with the 1980 RISC project and John Hennessy at Stanford with the 1981 MIPS project.
January 1975 Popular Electronics features on its cover the Altair 8800, calling it the “world’s first minicomputer kit.” The cover draws the attention of Paul Allen and Bill Gates who, with Monte Davidoff, develop a BASIC interpreter for the Altair, later used as both programming language and user interface for most early personal computers.
1976 The Cray-1 is installed at the Los Alamos National Laboratory. It remained the world’s fastest computer until 1982, when it was succeeded by the Cray X-MP.
1976 Will Crowther develops Adventure (Colossal Cave Adventure), a text-based adventure game, on the PDP-10 at BBN. It became widely popular, greatly influenced future video games, and has been ported to numerous computer systems. In 2017, Eric S. Raymond created a port for modern computers of Don Woods’ 1995 version of the game as Open Adventure.
January 1976 Ray Kurzweil introduces a reading machine for the blind, combining a flatbed scanner and a text-to-speech synthesizer.
June 1976 Wang Laboratories introduces the 1200 Wang Word Processing System (WPS), an easy-to-use, multiuser network of terminals, each incorporating the Intel 8080 microprocessor and 64 KB of RAM, all sharing central disk storage. It was an instant success, and Wang Labs ranked 8th in data processing revenues in 1983, up from 45th place in 1976.
1977 Commodore, Apple, and Tandy begin to produce fully functional and affordable personal computers, expanding the market beyond electronics hobbyists.
February 1978 Ward Christensen and Randy Suess launch the Computerized Bulletin Board System (CBBS), the first Bulletin Board System (BBS).
June 1978 Texas Instruments introduces Speak & Spell, one of the earliest handheld electronic devices with a display. It is based on a new chip, the TMS5100, which miniaturized all the circuits needed to synthesize speech.
1978 Japan’s Taito releases Space Invaders, a video arcade game developed by Tomohiro Nishikado that became one of the most influential video games of all time.
1979 The Stanford Cart successfully crosses a chair-filled room without human intervention in about five hours, becoming one of the earliest examples of an autonomous vehicle.
October 1979 VisiCalc is released, the first spreadsheet program for personal computers. The Apple II’s “killer app,” it was developed by Dan Bricklin and Bob Frankston.
July 1980 Minitel, a videotex online service offered by France Telecom, is tested with 55 residential and business telephone customers. At its peak in mid-1993, almost six and a half million terminals were used for ninety million connection hours.
1981 Apollo Computer introduces the first graphics workstation, based on the Motorola 68000 microprocessor, later joined by competitors, primarily Sun Microsystems and Silicon Graphics.
April 1981 The first successful portable computer, the Osborne 1, is released. Weighing 24.5 pounds, it sold for $1,795, a price that included the WordStar word processor and the SuperCalc spreadsheet.
August 1981 The IBM PC is released. Byte magazine describes it as “a synthesis of the best the microcomputer industry has offered to date.” It incorporates the Intel 8088, one or two 5.25″ floppy disk drives (each with a capacity of up to 320KB), and PC DOS, developed by Microsoft; early applications include MicroPro’s WordStar and Ashton-Tate’s dBase II.
October 1981 Dave Smith and Chet Wood propose what will become known as MIDI (musical instrument digital interface), a technical standard for connecting a wide variety of electronic musical instruments, computers, and related audio devices for playing, editing, and recording music.
October 1982 The first audio CD player, the Sony CDP-101, is released in Japan.
September 1983 A Louis Harris & Associates survey finds that 10% of U.S. adults have a home computer and, of those, 14% use a modem to send and receive information.
October 1983 Acorn Computers engineers start development of the Acorn RISC Machine (ARM) which is today’s most widely used processor architecture, powering the vast majority of smartphones and tablets.
January 1984 Apple Computer’s Steve Jobs introduces the Macintosh, the first mass-market desktop personal computer to feature a graphical user interface, a built-in screen, and a mouse. And the Mac said: “Never trust a computer you cannot lift.”
February 1985 The Whole Earth ‘Lectronic Link (WELL) is established, one of the first “virtual communities.” The WELL presented its first users with the disclaimer “You Own Your Own Words” (YOYOW), which aimed to attract interesting people into online conversation with each other while giving them responsibility for their own words and ideas.
March 1985 Denon and Sony introduce the CD-ROM at the COMDEX show in Japan.
May 1985 Quantum Computer Services, an online services company, is launched, offering Quantum Link, a dedicated online service for Commodore computers. It will later evolve into America Online (AOL), the most popular online service in the early 1990s.
July 1985 Aldus PageMaker launches the desktop publishing market.
1986 The first driverless car, a Mercedes-Benz van equipped with cameras and sensors, built at Bundeswehr University in Munich under the direction of Ernst Dickmanns, drives up to 55 mph on empty streets.
December 1987 Toshiba introduces NAND Flash Memory at the IEEE 1987 International Electron Devices Meeting. Retaining data when power is turned off, it can be electrically erased and reprogrammed. In the following decades, it has been key to the successful commercialization of new portable devices and eventually replaced hard disk storage in most personal computers.
1988 David Patterson, Garth A. Gibson, and Randy Katz at the University of California, Berkeley, argue in “A Case for Redundant Arrays of Inexpensive Disks (RAID)” that an array of inexpensive personal computer disk drives could outperform mainframe disk drives. Although failures would rise in proportion to the number of inexpensive drives, by configuring for redundancy, the reliability of an array could far exceed that of any large single drive.
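The redundancy argument reduces to simple probability arithmetic, sketched below with an assumed, purely illustrative 5% annual drive-failure rate; the toy calculation ignores repair windows and correlated failures, which the paper treats more carefully:

```python
# Adding drives multiplies failure opportunities, but with mirroring data
# is lost only if BOTH copies fail. The 5% rate is an assumption for
# illustration, not a figure from the 1988 paper.
p_fail = 0.05

single_drive_loss = p_fail                    # one drive, one point of failure
any_of_two_fails = 1 - (1 - p_fail) ** 2      # more drives, more failures...
mirrored_pair_loss = p_fail ** 2              # ...but both must die to lose data

print(f"single drive data loss:  {single_drive_loss:.4f}")   # 0.0500
print(f"some drive fails (of 2): {any_of_two_fails:.4f}")    # 0.0975
print(f"mirrored pair data loss: {mirrored_pair_loss:.4f}")  # 0.0025
```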
1988 Members of the IBM T.J. Watson Research Center publish “A statistical approach to language translation,” heralding the shift from rule-based to probabilistic methods of machine translation, and reflecting a broader shift to “machine learning” based on statistical analysis of known examples, not comprehension and “understanding” of the task at hand.
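The flavor of the statistical approach is to choose among candidate translations by counting known examples rather than applying hand-written grammar rules. A toy sketch (all word pairs and counts below are invented for illustration):

```python
from collections import Counter

# Pretend counts of translations observed in aligned English-French examples.
observed = {
    "bank": Counter({"banque": 112, "rive": 31}),
    "light": Counter({"lumière": 87, "léger": 54}),
}

def translate(word):
    """Pick the translation seen most often in the known examples."""
    candidates = observed.get(word)
    return candidates.most_common(1)[0][0] if candidates else word

print([translate(w) for w in ["bank", "light", "modem"]])
# ['banque', 'lumière', 'modem']
```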
October 1988 Dave Cutler and eleven other DEC engineers join Microsoft after their RISC-related project is killed. They develop Windows NT (launched in 1993) which has served as the core of every new version of Windows since 2001.
March 1989 Tim Berners-Lee circulates “Information management: A proposal” at CERN, outlining a global hypertext system which in December 1990 he launched as the World Wide Web.
1990 The U.S. Bureau of Labor Statistics estimates that 15% of U.S. households own a computer.
May 1990 Microsoft launches Windows 3.0, making the graphical user interface an integral component of mainstream personal computing.
July 1990 The Gartner Group coins the term “enterprise resource planning” (ERP) in a discussion of material requirements planning (MRP). It came to describe in the 1990s a suite of integrated applications, such as SAP R/3 (introduced in 1992), that enterprises used to collect, store, manage, and interpret data from many business functions and activities.
September 1991 Xerox PARC’s Mark Weiser publishes “The Computer in the 21st Century” in Scientific American, using the terms “ubiquitous computing” and “embodied virtuality” to describe his vision of how “specialized elements of hardware and software, connected by wires, radio waves and infrared, will be so ubiquitous that no one will notice their presence.”
September 1991 22-year-old Linus Torvalds posts Linux online, an operating system kernel which later evolved into a family of open-source Unix-like operating systems.
January 1993 Marc Andreessen announces version 0.5 of the NCSA X Mosaic Web browser, which he developed with Eric Bina at the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign. Andreessen will go on to co-found Mosaic Communications (later Netscape Communications), which released the first version of the Netscape Navigator browser in November 1994.
February 1993 The University of Minnesota decides it would charge a license fee for certain classes of Gopher users, effectively eliminating a key competitor to the World Wide Web.
April 1993 CERN declares the Web protocol and code free to all users.
August 1993 Apple launches the Newton, a “personal digital assistant.”
November 1993 The video camera monitoring the Trojan Room coffee pot at the University of Cambridge’s Computer Laboratory is connected to the Web, becoming the first Webcam. What had entertained a few locally connected people became a worldwide show, registering 1 million hits by 1996.
1994 Steve Mann develops a wearable wireless Webcam, considered the first example of lifelogging.
October 1994 HotWired is the first website to sell banner ads in large quantities to a wide range of major corporate advertisers.
May 1995 Sun releases Java, a programming language intended to let programmers “write once, run everywhere.” It was originally developed by James Gosling and others at Sun to allow interactive applications to be downloaded to digital cable television boxes. It became widely popular when Netscape used it to allow Web page designers to add animation, movement, and interactivity to their pages.
August 9, 1995 Netscape share price soars to $75 during its first day of trading, up from the offering price of $28. The Netscape IPO has been referred to in the media as the birth of the Web or even “the Internet.” It was certainly the birth of what came to be known as the “dot-com bubble.”
October 1995 The Pew Research Center finds that 14% of U.S. adults are now online, most using dial-up modem connections, but only 3% of online users have ever signed on to the World Wide Web. 42% of U.S. adults had never heard of the Internet and an additional 21% knew it had something to do with computers.
December 1995 MIT’s Nicholas Negroponte and Neil Gershenfeld write in “Wearable Computing” in Wired: “For hardware and software to comfortably follow you around, they must merge into softwear… The difference in time between loony ideas and shipped products is shrinking so fast that it’s now, oh, about a week.”
November 1996 The digital video disc (DVD) format, an extension of CD technology, is launched in Japan with the first major releases from Warner Home Video arriving a month later. DVD players became the fastest-adopted consumer devices in American history.
1998 The first Google index has 26 million Web pages. It reaches one billion in 2000 and one trillion in 2008.
1998 The first working digital video recorder (DVR) prototype is developed at Stanford University’s Computer Science Department. Consumer digital video recorders ReplayTV and TiVo were launched the next year.
May 1999 VMware delivers its first product, VMware Workstation. Its product line supporting virtual machines on servers based on Intel chips later played a key role in the proliferation of cloud computing.
August 2000 According to the U.S. Census Bureau, 51% of U.S. households have one or more computers, up from 8.2% in 1984 and 22.8% in 1993. 41.5% of households have access to the Internet, up from 18% in 1997.
October 2001 Apple introduces the iPod, or “a thousand songs in your pocket.” The first pocket-sized digital music players (e.g., the Diamond Rio) were introduced a few years earlier. 110 million iPods had been sold by 2007.
April 2003 Apple opens its digital media store, the iTunes Store, which became the world’s largest music retailer two years later.
2004 Jeffrey Dean and Sanjay Ghemawat publish “MapReduce: Simplified Data Processing on Large Clusters,” describing Google’s programming model for processing and generating big data sets with a parallel, distributed algorithm running on a cluster of computers.
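The model’s canonical illustration is word counting: a user-supplied map function emits (key, value) pairs, the framework groups the pairs by key, and a reduce function folds each group. A single-machine Python sketch of that flow, leaving out the distribution and fault tolerance that are the paper’s real contribution:

```python
from itertools import groupby

def map_fn(document):                 # map: document -> (word, 1) pairs
    for word in document.split():
        yield (word.lower(), 1)

def reduce_fn(word, counts):          # reduce: (word, [1, 1, ...]) -> total
    return (word, sum(counts))

documents = ["the quick brown fox", "the lazy dog", "the fox"]

# The "shuffle" phase: collect all emitted pairs and group them by key.
pairs = sorted(p for doc in documents for p in map_fn(doc))
result = [reduce_fn(word, [count for _, count in group])
          for word, group in groupby(pairs, key=lambda p: p[0])]

print(result)   # [('brown', 1), ('dog', 1), ('fox', 2), ('lazy', 1), ...]
```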
January 2007 Apple introduces the iPhone and changes its name from “Apple Computer, Inc.” to “Apple Inc.”
March 2007 Estonia becomes the world’s first country to use internet voting in a parliamentary election.
September 2008 The first smartphone running Google’s Android operating system, the HTC Dream, is announced.
December 2008 Randal E. Bryant, Randy H. Katz, and Edward D. Lazowska publish “Big-Data Computing: Creating Revolutionary Breakthroughs in Commerce, Science and Society,” arguing that “Big-data computing is perhaps the biggest innovation in computing in the last decade.”
2009 Google starts developing, in secret, a driverless car. In 2014, it became the first to pass a U.S. state self-driving test, in Nevada.
June 2009 The U.S. completes the changeover from analog to digital television.
February 2010 Siri, a voice-recognition “personal assistant” application, is released for the iPhone; Apple acquired it two months later and built it into the iPhone in 2011. Over the next few years, similar voice assistants were released by Amazon (Alexa), Google (Google Assistant), and Microsoft (Cortana).
February 2010 Kenneth Cukier writes in The Economist special report “Data, Data Everywhere”: “… a new kind of professional has emerged, the data scientist, who combines the skills of software programmer, statistician and storyteller/artist to extract the nuggets of gold hidden under mountains of data.”
February 2011 Martin Hilbert and Priscila Lopez publish “The World’s Technological Capacity to Store, Communicate, and Compute Information” in Science. They estimate that in 1986, 99.2% of all storage capacity was analog, but in 2007, 94% of storage capacity was digital, a complete reversal of roles (in 2002, digital information storage surpassed non-digital for the first time).
October 2012 A convolutional neural network (popularly known as “artificial intelligence” in the following years) designed by researchers at the University of Toronto achieves an error rate of only 16% in the ImageNet Large Scale Visual Recognition Challenge, a significant improvement over the 25% error rate achieved by the best entry the year before.
December 2012 Annual e-commerce sales top $1 trillion worldwide for the first time.
March 2016 Google DeepMind’s AlphaGo defeats Go champion Lee Sedol.
June 2018 OpenAI publishes “Improving Language Understanding by Generative Pre-Training,” introducing the new Generative Pre-trained Transformer (GPT) approach to natural language processing, based on “semi-supervised” learning.
July 2020 OpenAI releases GPT-3. With 175 billion machine learning parameters, it produces “human-like” text.
December 2021 The automotive industry accounts for around 15% of the global semiconductor market, with up to 3,000 chips in a single car. It is estimated that, because of the chip shortage, automakers worldwide could not produce about 11 million of the cars they had planned to make, costing the global auto industry about $210 billion in lost revenue in 2021.
April 2022 63% of the world’s total population, or 5 billion people, use the Internet, up by 200 million over the previous year. 92.4% of Internet users go online with a mobile phone at least some of the time, and mobile phones account for more than half of the world’s Web traffic.