Closing the Digital Frontier

The era of the Web browser’s dominance is coming to a close. And the Internet’s founding ideology—that information wants to be free, and that attempts to constrain it are not only hopeless but immoral—suddenly seems naive and stale in the new age of apps, smart phones, and pricing plans. What will this mean for the future of the media—and of the Web itself?

As Chris Anderson pointed out in a moment of non-hyperbole in his book Free, the phrase “Information wants to be free” was never meant to be the rallying cry it turned into. It was first uttered by Stewart Brand at a hacker conference in 1984, and it came with a significant disclaimer: that information also wants to be expensive, because it can be so important (see “Information Wants to Be Paid For,” in this issue). With the long tail of Brand’s dictum chopped off, the phrase “Information wants to be free”—dissected, debated, reconstituted as a global democratic rallying cry against monsters of the political, business, and media elites—became perhaps the most powerful meme of the past quarter century; so powerful, in fact, that multibillion-dollar corporations destroyed their own businesses at its altar.

It’s a bit of a Schrödinger’s-cat situation when you try to determine what would have happened if we had not bought into the IWTBF mantra, but by the time digital culture exploded into the mainstream with the introduction first of the Mosaic browser and then of Netscape Navigator and Internet Explorer, in the mid-’90s, “free” was already an idea only the very old or very obtuse dared to contradict. As far back as the mid-’80s, digital freedom was a cause célèbre on the Northern California–based Whole Earth ’Lectronic Link (known as the WELL), the wildly influential bulletin-board service that brought together mostly West Coast cyberspace pioneers to discuss matters of the day.

It gives you a feel for the WELL’s gestalt to know that Brand, who founded the WELL, was also behind the Long Now Foundation, which promotes the idea of a consciousness-expanding 10,000-year clock. Thrilling, intense, uncompromising, at times borderline self-parodically Talmudic, the WELL had roots in the same peculiar convergence of hippiedom and techno-savantism that created Silicon Valley, but it also called out, consciously and un-, to a neo-Jeffersonian idea of the digital pioneer as a kind of virtual sodbuster. The WELL-ite Howard Rheingold, in his 1993 digital manifesto, The Virtual Community: Homesteading on the Electronic Frontier, described himself as being “colonized” (in a good way) by his virtual community. The libertarian activist John Perry Barlow, an early member of the WELL’s board of directors, was a co-founder of the Electronic Frontier Foundation, a digital version of the ACLU.

At the WELL, the core gospel of an open Web was upheld with such rigor that when one of its more prolific members, Time magazine’s Philip Elmer-DeWitt, published a scare-the-old-folks cover story on cyber porn in 1995, which carried the implication that some measure of online censorship might not be a bad thing, he and his apostasy were torn to pieces by his fellow WELL-ites with breathtaking relentlessness. At the time, the episode was notable for being one of the first examples of the Web’s ability to fact-check, and keep in check, the mainstream media—it turned out that the study on which Time’s exclusive report was based was inaccurate, and its results were wildly overstated. In retrospect, what seems notable is the fervor with which digital correctness—the idea that the unencumbered flow of everything, including porn, must be defended—was being enforced. In the WELL’s hierarchy of values, pure freedom was an immutable principle, even if the underlying truth (that porn of all kinds was and would be increasingly ubiquitous on the Web, with actual real-life consequences) was ugly and incontestable.

Digital freedom, of the monetary and First Amendment varieties, may in retrospect have become our era’s version of Manifest Destiny, our Turner thesis. Embracing digital freedom was an exaltation, a kind of noble calling. In a smart essay in the journal Fast Capitalism in 2005, Jack Shuler shows how similar the rhetoric of the 1990s digital frontier was to that of the 19th-century frontier era. It’s a short jump from John L. O’Sullivan in 1839—“The far-reaching, the boundless future will be the era of American greatness. In its magnificent domain of space and time, the nation of many nations is destined to manifest to mankind the excellence of divine principles”—to Kevin Kelly, the pioneering conceptualizer of the “hive mind” and a founding editor of Wired, writing in Harper’s in 1994, “A recurring vision swirls in the shared mind of the Net, a vision that nearly every member glimpses, if only momentarily: of wiring human and artificial minds into one planetary soul.” Two years later Barlow, a self-described advocate for “online colonists,” got down on bended knee, doublet unbraced, to beseech us mere analog mortals: “Governments of the Industrial World, you weary giants of flesh and steel, I come from Cyberspace, the new home of Mind. On behalf of the future, I ask you of the past to leave us alone … You have no sovereignty where we gather.”

I take you on this quick tour not to make fun of futurism past (I have only slightly less-purple skeletons in my closet), but to point out how an idea that we have largely taken for granted is in fact the product of a very specific ideology. Despite its Department of Defense origins, the matrixed, hyperlinked Internet was both cause and effect of the libertarian ethos of Silicon Valley. The open-source mentality, in theory if not always in practice, proved useful for the tech and Internet worlds. Facebook and Twitter achieved massive scale quickly by creating an open system accessible to outside developers, though that openness is at times more about branding than anything else—as Twitter’s fellow travelers are now finding out. Mainframe behemoths like IBM wave the bloody shirt of Linux, the nonprofit open-source competitor of Microsoft Windows, any time they need to prove their bona fides to the tech community. Ironically, only the “old” entertainment and media industries, it seems, took “open” and “free” literally, striving to prove that they were fit for the digital era’s freewheeling information/entertainment bazaar by making their most expensively produced products available for free on the Internet. As a result, they undermined in little more than a decade a value proposition they had spent more than a century building up.

But now, it seems, things are changing all over again. The shift of the digital frontier from the Web, where the browser ruled supreme, to the smart phone, where the app and the pricing plan now hold sway, signals a radical shift from openness to a degree of closed-ness that would have been remarkable even before 1995. In the U.S., there are only three major cell-phone networks, a handful of smart-phone makers, and just one Apple, a company that has spent the entire Internet era fighting the idea of “open” (as anyone who has tried to move legally purchased digital downloads among devices can attest). Since as far back as the ’80s, when it launched the desktop-publishing revolution, Apple has made the case that the bourgeois comforts of an artfully constructed end-to-end solution, despite its limits, are superior to the freedom and danger of the digital badlands.

Apple, for once, is swimming with the tide. After 15 years of fruitless experimentation, media companies are realizing that an advertising-supported model is not the way to succeed on the Web, and they are, at last, seeking to get consumers to pay for their content. They are operating on the largely correct assumption that people will be more likely to pay for consumer-friendly apps via the iPad, and a multitude of competing devices due out this year, than they are to subscribe to the same old kludgy Web sites they have been using freely for years. As a result, media companies will soon be pushing their best and most timely content through their apps instead of their Web sites. Meanwhile, video-content services are finding that they don’t even need to bother with the Web and the browser. Netflix, for one, is well on its way to sending movies and TV shows directly to TV sets, making its customers’ experience virtually indistinguishable from ordering up on-demand shows by remote control. It’s far from a given that this shift will generate the kinds of revenue media companies are used to: for under-30s whelped on free content, the prospect of paying hundreds or thousands of dollars yearly for print, audio, and video (on expensive new devices that require paying AT&T $30 a month) is not going to be an easy sell.

Yet lack of uptake by young people will hardly stop the rush to apps. There’s too much potential upside. And with Apple in the driver’s seat, the rhetoric of “free” is becoming notably more muted. In rolling out the iPad, Steve Jobs has been aggressive and, to date, unapologetic about policing apps deemed unacceptable for the iPad store (or apps whose creators hold opinions that are anathema to Apple’s corporate interests or sense of universal order). And Apple has so far refused to enable Flash, the Adobe technology that runs 75 percent of all videos seen on the Web, and is launching its own ad-sales platform, presumably to control and monetize traffic on its devices.

On a more conceptual level, the move from the browser model to the app model (where content is more likely to be accessed via smartly curated “stores” like iTunes, Amazon, or Netflix) signals the first real taming of the Wild Digital West. Apple’s version of the West has nice white picket fences, clapboard houses, morals police, and lots of clean, well-organized places to spend money. (The Internet, it seems, is finally safe for Rupert Murdoch.) These shifts are seemingly subtle, but they may prove profound. Google, which built its once-monopolistic position by harnessing the chaos of Web search, has been forced to move aggressively to preserve its business model against this new competition: it has teamed up with the Apple-scorned Flash; is making conciliatory gestures to the content owners it once patronized; has reached a deal to purchase a mobile ad-sales platform; and is promoting its own vision of the future based on cloud computing. Phones using its open-source smart-phone operating system, Android, are outselling the iPhone. Even so, Google still needs the Web, however it’s accessed, to remain central—because without contextual search advertising, Google ceases to matter. Smart phones in general, and the iPad more pointedly, are not driven by search.

All of this suggests that the era of browser dominance is coming to a close. Twitter, like other recent-vintage social networks, is barely bothering with its Web site; its smart-phone app is more fully featured. The independent TweetDeck, which collates feeds across multiple social networks, is not browser-based. As app-based usage climbs at the expense of the browser and as more content creators put their text, audio, and video behind pay walls, it will be interesting to see what happens to the Twitterverse and blogosphere, which piggyback on, and draw creative juice from, their ability to link to free Web content. If they don’t end up licensing original content, networks such as Twitter and Facebook will become purely communication vehicles. At first glance, Web sites like The Daily Beast and The Huffington Post will have a hard time once they lose their ability to hypertext their digests; on second glance, they will have an opportunity to sop up some of the traffic that once went to their now-paid rivals. Google, meanwhile, is hoping to find ways to link through pay walls and across platforms, but this model will clearly not be the delightfully free-form open plain of the early Web. Years from now, we may look back at these past 15 years as a discrete (and thrillingly indiscreet) chapter in the history of digital media, not the beginning of a new and enlightened dispensation. The Web will be here forever; that is not in question. But as Don Henley sang in “The Last Resort,” the Eagles’ brilliant, haunting song about the resortification of the West, “You call someplace paradise, kiss it goodbye.”

Which brings us back to manifest destinies, physical and digital. As Patricia Limerick has argued in her reconsideration of frontier ideology, the moonstruck rhetoric of Manifest Destiny in the 1800s, though it may have been sincere, neatly papered over a host of less enlightened agendas. The surge west was a critical driver of economic growth, allowing the growing republic to harness vast amounts of natural resources and create new markets. The high-flown ideology of Manifest Destiny was, in short, a cover for a massive land grab (not to mention the slaughter of the Indians). The same is happening online. Now, instead of farmers versus ranchers, we have Apple versus Google. In retrospect, for all the talk of an unencumbered sphere, of a unified planetary soul, the colonization and exploitation of the Web was a foregone conclusion. The only question now is who will own it.

Michael Hirschorn is an Atlantic contributing editor.