



  HOW THE

  INTERNET

  HAPPENED

  FROM NETSCAPE TO THE IPHONE

  BRIAN

  McCULLOUGH

  LIVERIGHT PUBLISHING CORPORATION

  A Division of W. W. NORTON & COMPANY

  Independent Publishers Since 1923

  NEW YORK LONDON

  For:

My Dad, who taught me to love history

  And for Werner Stocker, who taught me that history was cool

  “Then there is electricity!—the demon, the angel, the mighty physical power, the all-pervading intelligence!” exclaimed Clifford. “Is that a humbug, too? Is it a fact—or have I dreamt it—that, by means of electricity, the world of matter has become a great nerve, vibrating thousands of miles in a breathless point of time? Rather, the round globe is a vast head, a brain, instinct with intelligence! Or, shall we say, it is itself a thought, nothing but thought, and no longer the substance which we deemed it!”

  —THE HOUSE OF THE SEVEN GABLES,

  CHAPTER 17,

  “THE FLIGHT OF TWO OWLS,”

  NATHANIEL HAWTHORNE

  ■

  I think one of the things that really separates us from the high primates is that we’re tool builders. I read a study that measured the efficiency of locomotion for various species on the planet. The condor used the least energy to move a kilometer. And, humans came in with a rather unimpressive showing, about a third of the way down the list. It was not too proud a showing for the crown of creation. So, that didn’t look so good. But, then somebody at Scientific American had the insight to test the efficiency of locomotion for a man on a bicycle. And, a man on a bicycle, a human on a bicycle, blew the condor away, completely off the top of the charts.

  And that’s what a computer is to me. What a computer is to me is it’s the most remarkable tool that we’ve ever come up with, and it’s the equivalent of a bicycle for our minds.

  —STEVE JOBS

  1990 INTERVIEW FOR THE FILM

  MEMORY & IMAGINATION

  CONTENTS

  INTRO

  1 THE BIG BANG ■ The Mosaic Web Browser and Netscape

  2 BILL GATES “GETS” THE INTERNET ■ Microsoft and Internet Explorer

  3 AMERICA, ONLINE ■ AOL and the Early Online Services

  4 BIG MEDIA’S BIG WEB ADVENTURE ■ Pathfinder, HotWired and Ads

  5 HELLO, WORLD ■ The Early Search Engines and Yahoo

  6 GET BIG FAST ■ Amazon.com and the Birth of Ecommerce

  7 TRUSTING STRANGERS ■ eBay, Community Sites and Portals

  8 BLOWING BUBBLES ■ The Dot-com Era

  9 IRRATIONAL EXUBERANCE ■ The Dot-com Bubble

  10 POP! ■ Netscape vs. Microsoft, AOL + Time Warner and the Nuclear Winter

  11 I’M FEELING LUCKY ■ Google, Napster and the Rebirth

  12 RIP. MIX. BURN. ■ The iPod, iTunes and Netflix

  13 A THOUSAND FLOWERS, BLOOMING ■ PayPal, AdWords, Google’s IPO and Blogs

  14 WEB 2.0 ■ Wikipedia, YouTube and the Wisdom of Crowds

15 THE SOCIAL NETWORK ■ Facebook

  16 THE RISE OF MOBILE ■ Palm, BlackBerry and Smartphones

  17 ONE MORE THING ■ The iPhone

  OUTRO

  ACKNOWLEDGMENTS

  NOTES

  INDEX

  HOW THE INTERNET HAPPENED

  INTRO

  When computers were first developed in the 1940s and ’50s, it was never imagined that the common man or woman would ever need, much less have use for, them. Computers were designed for big problems: calculating missile trajectories; putting a man on the moon. Legend holds that the founder of IBM, Thomas J. Watson, once remarked, “I think there is a world market for maybe five computers.” This quote is probably apocryphal, but it does capture the early thinking when computers first began to serve man. They were to be rare, expensive oracles; and like the oracles of ancient times, they would be useful only in rare, exceptional cases.

Computers were expensive in the beginning. They were complicated and difficult and they were as big as a room (that is not hyperbolic phrasing; generally considered the first modern computer, the ENIAC occupied about 1,800 square feet and weighed about 30 tons). The received wisdom of their rarefied utility affected their design; computers were not conceived to be user-friendly because it was never assumed a nonexpert user would interact with one. Histories of early computers talk about a “priesthood” of computer specialists who actually interacted with the machines. Say you had a mathematical or engineering problem to solve. You submitted your punch cards to the “priests,” and they used the computer to tease out the answer. Even when computers began to infiltrate the workplace in the 1960s through the 1980s (much to the surprise of the computer industry itself), it was still assumed that an “average” computer user could only achieve competency in limited, specific tasks or programs. The greater wrangling or mastery of the system at large was left to the early progenitors of what would become known as “the IT guy.”

  And yet, the tantalizing, almost forbidden mystique of computers seduced a generation of what were considered hobbyists in the 1970s. The hobbyists wanted to master computers themselves. They wanted computers that responded to them directly, without intermediaries. They wanted personal computing. And so, they made it happen. Steve Jobs, Steve Wozniak, Bill Gates, the Homebrew Computer Club—the hobbyists created the personal computer category (originally, they were called microcomputers) and thus, the PC Industry.

  That still wasn’t quite enough to make computers friendly to the average person. Almost a decade into the PC era, the industry remained trapped in the paradigm of the “command line.” If you sat in front of a computer, you would see a blinking cursor and would need to type something to make the machine do anything for you. What would you need to type? Well, see, that’s the point: it was functional inscrutability that continued to make computers so abstruse. In the era of the command line, you almost needed to have read the manual cover-to-cover or have previously mastered a computer language even to use the damn things. You had to know how to use a computer before you could use a computer.

This problem was solved by the invention of the GUI, or graphical user interface. Computers were humanized by graphics, by colors, by friendly icons and drop-down menus and a cute little tool called a mouse. Now when you sat in front of a computer, you could grab the mouse and just—click. You didn’t have to know anything beforehand. You could learn how to use the machine by using it. Invented by Xerox, popularized by Apple Computer and the Macintosh, and then mainstreamed by Microsoft and its Windows operating system, the GUI was the evolutionary leap that would eventually make computers friendly to the average user.

  But even when computers began to enter our everyday lives, our offices, our homes, they still were a bit esoteric. You might use a word processor at your job. Your kids might play computer games in your basement. But you didn’t really need or use a computer in your everyday life. By 1990, only 42% of U.S. adults said they used a computer even “rarely.”1 In that same year, the number of American households that owned a computer had not yet passed 20%.2

  ■

THE INTERNET, AND ESPECIALLY the World Wide Web, finally brought computers into the mainstream. The Internet is the reason that computers actually became useful for the average person. The Internet is the thing that made a computer something you check in with daily, even hourly. And that is what this book is about: how the web and the Internet allowed computers to infiltrate our everyday lives. This is not a history of the Internet itself, but rather, a history of the Internet Era, that period of time from roughly 1993 through 2008 when computers and technology itself stopped being esoteric and started becoming vital and indispensable. It is about great technologies and disruptions and entrepreneurs. It is about how we allowed these technologies into our lives, and how these technologies changed us.

  ■

  LIKE COMPUTERS, THE INTERNET was not designed with you and me in mind.

Computers were first hooked together in a meaningful way in 1969. This was the ARPANET, the grandfather of the Internet, and (mostly) true to legend, it was birthed by a Cold War–era alliance of the United States military and the academic-industrial complex. Funded by ARPA, the Advanced Research Projects Agency (later renamed DARPA), the ARPANET’s first four connections, or “nodes,” were all at academic research centers: the University of California, Los Angeles; the Stanford Research Institute; the University of California, Santa Barbara; and the University of Utah.

  The ARPANET was a blue-sky research project that, ostensibly, would allow for greater (and more resilient) communications among decision-makers during a nuclear strike. While they sold their project to the generals this way, the academics behind the ARPANET then spent the next twenty years evolving the network into a system that better suited their own needs: a distributed, nonhierarchical computer and communications network that facilitated discussion and exchange among the research and scientific community. The ARPANET evolved into the Internet we recognize today not as a populist or mass-market communications system, but as an electronic playground where a priesthood of academics could play and exchange ideas.

  This elitist focus showed in the Internet’s maturity. All of the various Internet protocols that developed, from the most obvious, such as FTP (File Transfer Protocol) and TCP/IP (the basic building block of the Internet itself), to the most recent and seemingly sophisticated, such as newsgroups, Gopher (the first true Internet search) and even email—all were complex to set up, unfriendly to nonexpert users, and, frankly, boring and utilitarian. Even as computers became personal, even as computing itself was becoming colorful and democratized, the Internet remained stubbornly aloof, sequestered in the ivory towers of academia.

The Internet, in short, needed its own GUI revolution, that application/user interface innovation that would make the Internet user-friendly just as the graphical user interface had done for computing itself. The World Wide Web arrived at just the right time and provided exactly this paradigm shift.

  The web came in 1990, just as Windows was beginning to take computers into the majority of the world’s homes and offices, and just as the computer mouse and the graphical icon were making computing point-and-click intuitive for everyday people. The web lived in this world. You navigated the web with a mouse, you clicked on links, and the whole thing moved with the innate, logical simplicity of how human thought seems to work: jumping from one idea or association to another, flowing backward and forward in the direction of idea and inspiration, reference and retort. The web took the fundamental concept of the Internet (connecting computers together) and made it manifest through the genius of the hyperlink. One website linked to another. One idea linked to another. This metaphor of the link made the whole conceptual idea of the Internet, of linking computers together, of linking people’s minds together, of linking human thought together, finally and wonderfully real.

  And yet, the web itself was still a child of academia. It remained a researcher’s dream of a scholarly utopia. It is well known that Tim Berners-Lee invented the web while he was employed at CERN, the great multinational scientific research institution in Switzerland. As the Internet was born in the midst of a great scientific effort to win the Cold War, the web was born in the midst of a great scientific effort to reveal the secrets of the Big Bang.

  Berners-Lee saw his new Internet protocol as an improvement on top of the existing structure of the Internet itself. He built the web upon previous conceptual and philosophical notions (hypertext, cyberspace, collaboration) to create what was really a new medium. In his Usenet post announcing the web, Berners-Lee declared, “The WWW project merges the techniques of information retrieval and hypertext to make an easy but powerful global information system.”3 But he still envisioned it—at heart—as a research medium, a way for the hundreds of CERN scientists from all around the world to share their data, disseminate their ideas, and collaborate on research.

  Again, from his announcement post:

  The WWW project was started to allow high energy physicists to share data, news, and documentation. We are very interested in spreading the web to other areas, and having gateway servers for other data. Collaborators welcome!

  The collaborators Berners-Lee was calling for were imagined as fellow researchers and academics. The web, for all the structural ways that it would eventually prove friendly to the average computer user, was still intended for the priesthood, not the masses.

There was one final, catalyzing event that had to happen before the web—and with it, the Internet on the whole—could go mainstream. There was one more innovation necessary before the average user would be invited to join the computer revolution en masse, and we could create a world with Amazon and smart televisions and app stores and self-driving cars and cat memes.

  That one more thing would, in fact, come from a research institution, but it would serve to wrest the Internet and computers themselves from the privileged clutches of academia forever and thrust them into the loving embrace (and eventually the pockets) of average users like you and me.

  1

  THE BIG BANG

The Mosaic Web Browser and Netscape

Netscape Communications Corporation held an initial public offering, or IPO, on August 9, 1995. Netscape shares were originally to be priced at $14 per share, but at the last minute the price was lifted to $28 per share. When the markets opened at 9:30 A.M. Eastern Time, Netscape’s stock did not open with it. Buyer demand was so great that an orderly market could not immediately be made. Interest from individual investors was so overwhelming that callers to the retail investment firm Charles Schwab were greeted by a recording that said: “Welcome to Charles Schwab. If you’re interested in the Netscape IPO, press one.” At Morgan Stanley, one retail investor offered to mortgage her home and put the proceeds into Netscape stock. The first Netscape trade did not hit the ticker until around 11 A.M. The price of that first trade was $71, almost triple the offer price.

Over the course of the day Netscape, with the ticker symbol NSCP, reached $75 before ending the day at a respectable $58.25. Netscape had only existed as a corporation for sixteen months. Since its inception, it had generated revenues of only $17 million. It had nothing in the way of profits on its balance sheet. But at the end of trading that day, the stock market valued the company at $2.1 billion.

These days we’re used to embryonic technology companies debuting on the stock market to soaring valuations, but in August of 1995, such an event was almost unheard of. The financial press was in awe, if skeptical. On its front page the next day, the Wall Street Journal said, “It took General Dynamics Corp. 43 years to become a corporation worth $2.7 billion. . . . It took Netscape Communications Corp. about a minute.”1 Plenty of commentators were shocked that a company that had yet to make any sustained profit could be valued so highly. Still others were puzzling over what this “Internet” thing even was, and why it was making people rich. As August 9 also happened to be the day that Jerry Garcia of the Grateful Dead died, a joke made the rounds on Wall Street: What were Jerry Garcia’s last words? Answer: “Netscape opened at what?”

A lot of the chatter was about the sudden, unprecedented and remarkable creation of wealth. Cofounder Jim Clark’s 20% stake in the company was worth $663 million on the day of the IPO. Early Netscape employees were worth millions of dollars (on paper at least), including the company’s baby-faced, twenty-four-year-old cofounder, barely out of college, who was suddenly worth $58 million.

A few short months later, in December 1995, Netscape’s stock price would hit $171 a share, more than six times the price at the IPO. A few weeks after this milestone, that same twenty-four-year-old, Marc Andreessen, would grace the cover of Time magazine.

There are occasionally events that signal the arrival of a new force in culture (say, the Beatles on The Ed Sullivan Show) or serve as the demarcation line between historical eras (September 11, 2001, for example). The Netscape IPO was just such a moment in time. Today, young twenty-somethings dream of coding their way to billion-dollar fortunes. Today, the phone in your pocket is more powerful than every computer involved in the moon landing. Today, it’s possible to know, in real time, what your high school crush had for lunch. Netscape set the groundwork for this reality. The Netscape IPO was the big bang that started the Internet Era. That picture of a barefoot Marc Andreessen on the cover of Time was what started young geeks dreaming of Silicon Valley riches. Netscape would not define the Internet Era—or even survive it—but it was the first of its kind, and in many ways it was the template for all the people and companies that would follow.

  ■

THE MODERN WEB ERA began in Champaign, Illinois. The University of Illinois at Urbana-Champaign is world-famous as a leading research institution in the field of computing. The ORDVAC and ILLIAC, two of the earliest computers in the world, were built there in the early 1950s; the university was granted Unix license number one by Bell Laboratories in 1975; and in 1985, the National Center for Supercomputing Applications (NCSA) was established there. In the famous science fiction movie 2001: A Space Odyssey, the homicidal HAL 9000 computer states that he “became operational” in Urbana, Illinois, on January 12, 1992, partially as a nod to the university’s prominence in the field.

When the National Science Foundation took over the operations of the Internet in the 1980s, the University of Illinois was a key part of the Internet “backbone,” that superstructure of digital pipes that allowed the network to function.2 By 1992, when the superfast T3 network was launched as the successor backbone for the Internet, the NCSA and the university were sitting on some of the fastest computer connections in the world. In other words, by the early 1990s, there wasn’t a better place in the world to be if you wanted to be swept up in the revolution of the World Wide Web.