On the Internet, We’re Always Famous

What happens when the experience of celebrity becomes universal?
A person with very large and furry ears looking at their phone while abstract shapes surround them.
Illustration by Jeffrey Kam

The fennec fox is the smallest fox on earth and cute as a button. It has mischievous dark eyes, a small black nose, and impish six-inch ears—each several times larger than its head. The fennec is native to the Sahara, where its comically oversized auricles play two key roles: they keep the fox cool in the baking sun (blood runs through the ears, releases heat, and circulates back through the body, now cooler), and they give the fox astoundingly good hearing, allowing it to pick up the comings and goings of the insects and reptiles it hunts for food.

The children’s section of the Bronx Zoo features a human-sized pair of fennec-fox ears that give an approximation of the fox’s hearing. Generations of New Yorkers have pictures of themselves with their chins resting on a bar between the two enormous, sculptural ears, taking in the sounds around them. I first encountered the ears as a kid, in the eighties. In my memory, inhabiting the fox’s hearing is disquieting. The exhibit is not in the middle of the Sahara on a moonlit night. The soundscape is not deathly quiet, dusted by the echoes of a lizard whooshing through the sand. The effect is instant sensory overload. You suddenly hear everything at once—snippets of conversation, shrieks, footsteps—all of it too much and too loud.

Imagine, for a moment, you find yourself equipped with fennec-fox-level hearing at a work function or a cocktail party. It’s hard to focus amid the cacophony, but with some effort you can eavesdrop on each and every conversation. At first you are thrilled, because it is thrilling to peer into the private world of another person. Anyone who has ever snuck a peek at a diary or spent a day in the archives sifting through personal papers knows that. Humans, as a rule, crave getting up in people’s business.

But something starts to happen. First, you hear something slightly titillating, a bit of gossip you didn’t know. A couple has separated, someone says. “They’ve been keeping it secret. But now Angie’s dating Charles’s ex!” Then you hear something wildly wrong. “The F.D.A. hasn’t approved it, but also there’s a whole thing with fertility. I read about a woman who had a miscarriage the day after the shot.” And then something offensive, and you feel a desire to speak up and offer a correction or objection before remembering that they have no idea you’re listening. They’re not talking to you.

Then, inevitably, you hear someone say something about you. Someone thinks it’s weird that you’re always five minutes late for the staff meeting, or wonders if you’re working on that new project that Brian started doing on the side, or what the deal is with that half-dollar-sized spot of gray hair on the back of your head. Injury? Some kind of condition?

Suddenly—and I speak from a certain kind of experience on this, so stay with me—the thrill curdles. If you overhear something nice about you, you feel a brief warm glow, but anything else will ball your stomach into knots. The knowledge is taboo; the power to hear, permanently cursed.

It would be better at this point to get rid of the fennec ears. Normal human socializing is impossible with them. But even if you leave the room, you can’t unhear what you’ve heard.

This is what the Internet has become.

It seems distant now, but once upon a time the Internet was going to save us from the menace of TV. Since the late fifties, TV has had a special role, both as the country’s dominant medium, in audience and influence, and as a bête noire for a certain strain of American intellectuals, who view it as the root of all evil. In “Amusing Ourselves to Death,” from 1985, Neil Postman argues that, for its first hundred and fifty years, the U.S. was a culture of readers and writers, and that the print medium—in the form of pamphlets, broadsheets, newspapers, and written speeches and sermons—structured not only public discourse but also modes of thought and the institutions of democracy itself. According to Postman, TV destroyed all that, replacing our written culture with a culture of images that was, in a very literal sense, meaningless. “Americans no longer talk to each other, they entertain each other,” he writes. “They do not exchange ideas; they exchange images. They do not argue with propositions; they argue with good looks, celebrities and commercials.”

This revulsion against the tyranny of TV seemed particularly acute in the early years of the George W. Bush Administration. In 2007, George Saunders wrote an essay about the bleating idiocy of American mass media in the era after 9/11 and the run-up to the Iraq War. In it, he offers a thought experiment that has stuck with me. Imagine, he says, being at a party, with the normal give and take of conversation between generally genial, informed people. And then “a guy walks in with a megaphone. He’s not the smartest person at the party, or the most experienced, or the most articulate. But he’s got that megaphone.”

The man begins to offer his opinions and soon creates his own conversational gravity: everyone is reacting to whatever he’s saying. This, Saunders contends, quickly ruins the party. And if you have a particularly empty-minded Megaphone Guy, you get a discourse that’s not just stupid but that makes everyone in the room stupider as well:

Let’s say he hasn’t carefully considered the things he’s saying. He’s basically just blurting things out. And even with the megaphone, he has to shout a little to be heard, which limits the complexity of what he can say. Because he feels he has to be entertaining, he jumps from topic to topic, favoring the conceptual-general (“We’re eating more cheese cubes—and loving it!”), the anxiety- or controversy-provoking (“Wine running out due to shadowy conspiracy?”), the gossipy (“Quickie rumored in south bathroom!”), and the trivial (“Which quadrant of the party room do YOU prefer?”).

Yes, he wrote that in 2007, and yes, the degree to which it anticipates the brain-goring stupidity of Donald Trump’s pronouncements is uncanny. Trump is the brain-dead megaphone made real: the dumbest, most obnoxious guy in the entire room given the biggest platform. And our national experiment with putting a D-level cable-news pundit in charge of the nuclear arsenal went about as horribly as Saunders might have predicted.

But Saunders’s critique runs deeper than the insidious triviality and loudness of major TV news, both before and after 9/11. He’s making the case that forms of discourse actually shape our conceptual architecture, that the sophistication of our thinking is determined to a large degree by the sophistication of the language we hear used to describe our world.

This is, of course, not a new contention: the idea that dumb media make us all dumber echoes from the very first critiques of newspapers and pamphlets in America, in the late eighteenth century, through the rise of the tabloid press, to the 1961 speech in which the then Federal Communications Commission Chairman, Newton Minow, told the National Association of Broadcasters that, basically, their product sucked and that TV amounted to a “vast wasteland.”

I thought, and many of us thought, that the Internet was going to solve this problem. The rise of the liberal blogs, during the run-up to Barack Obama’s election, brought us the headiest days of Internet Discourse Triumphalism. We were going to remake the world through radically democratized global conversations.

That’s not what happened. To oversimplify, here’s where we ended up. The Internet really did bring new voices into a national discourse that, for too long, had been controlled by far too narrow a group. But it did not return our democratic culture and modes of thinking to pre-TV logocentrism. The brief renaissance of long blog arguments was short-lived (and, honestly, it was a bit insufferable while it was happening). The writing got shorter and the images and video more plentiful until the Internet birthed a new form of discourse that was a combination of word and image: meme culture. A meme can be clever, even revelatory, but it is not discourse in the mode that Postman pined for.

As for the guy with the megaphone prattling on about the cheese cubes? Well, rather than take that one dumb guy’s megaphone away, we added a bunch of megaphones to the party. And guess what: that didn’t much improve things! Everyone had to shout to be heard, and the conversation morphed into a game of telephone, of everyone shouting variations of the same snippets of language, phrases, slogans—an endless, aural hall of mirrors. The effect is so disorienting that after a long period of scrolling through social media you’re likely to feel a profound sense of vertigo.

Not only that: the people screaming the loudest still get the most attention, partly because they stand out against the backdrop of an undulating wall of sound that is now the room tone of our collective mental lives. Suffice it to say: the end result was not really a better party, nor the conversation of equals that many of us had hoped for.

Which, I think, brings us back to the fox ears.

The most radical change to our shared social lives isn’t who gets to speak, it’s what we can hear. True, everyone has access to their own little megaphone, and there is endless debate about whether that’s good or bad, but the vast majority of people aren’t reaching a huge audience. And yet at any single moment just about anyone with a smartphone has the ability to surveil millions of people across the globe.

The ability to surveil was, for years, almost exclusively the province of governments. In the legal tradition of the U.S., it was seen as an awesome power, one that was subject to constraints, such as warrants and due process (though often those constraints were more honored in the breach). And not only that, freedom from ubiquitous surveillance, we were taught in the West, was a defining feature of Free Society. In totalitarian states, someone or something was always listening, and the weight of that bore down on every moment of one’s life, suffocating the soul.

Well, guess what? We have now all been granted a power once reserved for totalitarian governments. A not particularly industrious fourteen-year-old can learn more about a person in a shorter amount of time than a team of K.G.B. agents could have done sixty years ago. The teen could see who you know, where you’ve been, which TV shows you like and don’t like; the gossip that you pass along and your political opinions and bad jokes and feuds; your pets’ names, your cousins’ faces, and your crushes and their favorite haunts. With a bit more work, that teen could get your home address and your current employer. But it’s the ability to access the texture of everyday life that makes this power so awesome. It’s possible to get inside the head of just about anyone who has a presence on the social Web, because chances are they are broadcasting their emotional states in real time to the entire world.

So total is the public presence of our private lives that even those whose jobs depend on total privacy cannot escape its reach. The open-source intelligence outfit Bellingcat has used this fact to track down a wide array of global malefactors, including the two Russian agents who appear to have poisoned a Russian defector in the U.K., Sergei Skripal, with a nerve agent, in 2018. Bellingcat was able to identify both men through data it purchased on the gray market, obtaining their aliases and photos of each. But the breakthrough came when it was discovered that one suspect had attended the wedding of the daughter of their G.R.U. unit’s commander. In a video—posted on Instagram, of course—the commander walks his daughter down the aisle on a lovely dock, to the sounds of a bossa nova cover of “Every Breath You Take.”

The young couple didn’t just post clips of their wedding (which was gorgeous, by the way) to Instagram. They also uploaded a highly stylized video, set to upbeat music, that shows them in bathrobes getting ready for the ceremony as well as the big moments of the wedding itself. To establish the suspect’s attendance at the ceremony, Bellingcat scanned other posted snapshots of the wedding and compared them with images in the video. Sure enough, the identity of the man in question, Anatoliy Chepiga, matched that of the alias he’d used to travel to the U.K. for the attempted murder.

Bellingcat published its findings, and, presumably, a whole host of Russian military and intelligence officials—maybe all the way up the chain to Vladimir Putin—realized that the utterly innocuous social media posts of a happy young couple had tripped off the identification of someone indicted for attempted murder and wanted by the British authorities.

This is an extreme example of a common phenomenon. Someone happens upon a social-media artifact of a person with a tiny number of followers and sends it shooting like a firework into the Internet, where it very briefly burns white-hot in infamy. There are some who find the sudden attention thrilling and addictive: this will be their first taste of a peculiar experience they then crave and chase. And there are others, like our newlyweds, who very much do not want the attention. They belatedly try to delete the post or make it “private,” but by then it’s too late for privacy. A message they intended for friends and family, people they have relationships with, ended up in the hands of strangers, people who don’t know them at all.

Never before in history have so many people been under the gaze of so many strangers. Humans evolved in small groups, defined by kinship: those we knew, knew us. And our imaginative capabilities allowed us to know strangers—kings and queens, heroes of legend, gods above—all manner of at least partly mythic personalities to whom we may have felt as intimately close as kin. For the vast majority of our species’ history, those were the two principal categories of human relations: kin and gods. Those we know who know us, grounded in mutual social interaction, and those we know who don’t know us, grounded in our imaginative powers.

But now consider a third category: people we don’t know and who somehow know us. They pop up in mentions, comments, and replies; on subreddits, message boards, or dating apps. Most times, it doesn’t even seem noteworthy: you look down at your phone and there’s a notification that someone you don’t know has liked a post. You might feel a little squirt of endorphin in the brain, an extremely faint sense of achievement. Yet each instance of it represents something new as a common human experience, for their attention renders us tiny gods. The Era of Mass Fame is upon us.

If we define fame as being known to many people one doesn’t know, then it is an experience as old as human civilization. Stretching back to the first written epic, Gilgamesh (whose protagonist was, in fact, an actual king), history, particularly as it is traditionally taught, is composed almost entirely of the exploits of the famous: Nefertiti, Alexander the Great, Julius Caesar, Muhammad, and Joan of Arc.

But as the critic Leo Braudy notes, in his 1987 study, “The Frenzy of Renown,” “As each new medium of fame appears, the human image it conveys is intensified and the number of individuals celebrated expands.” Industrial technology—newspapers and telegraphs, followed by radio, film, and TV—created an ever-larger category of people who might be known by millions the world over: politicians, film stars, singers, authors. This category was orders of magnitude larger than it had been in the pre-industrial age, but still a nearly infinitesimal portion of the population at large.

All that has changed in the past decade. In the same way that electricity went from a luxury enjoyed by the American élite to something just about everyone had, so, too, has fame, or at least being known by strangers, gone from a novelty to a core human experience. The Western intellectual tradition spent millennia maintaining a conceptual boundary between public and private—embedding it in law and politics, norms and etiquette, theorizing and reinscribing it. With the help of a few tech firms, we basically tore it down in about a decade.

That’s not to say the experience of being known, paid attention to, commented on by strangers, is in any sense universal. It’s still foreign to most people, online and off. But now the possibility of it haunts online life, which increasingly is just life. The previous limiting conditions on what’s private and what’s public, on who can know you, have been lifted. In the case of our young Russian lovebirds, one might safely assume that, until Bellingcat started snooping around their wedding videos, they had been spared the experience of the sudden burst of Internet fame. But, like them, just about everyone is always dancing at the edge of that cliff, oblivious or not.

This has been entirely internalized by the generation who’ve come of age with social media. A clever TikTok video can end up with forty million views. With the possibility of this level of exposure so proximate, it’s not surprising that poll after poll over the past decade indicates that fame is increasingly a prime objective of people twenty-five and younger. Fame itself, in the older, more enduring sense of the term, is still elusive, but the possibility of a brush with it functions as a kind of pyramid scheme.

This, perhaps, is the most obviously pernicious part of the expansion of celebrity: ever since there have been famous people, there have been people driven mad by fame. In the modern era, it’s a cliché: the rock star, comedian, or starlet who succumbs to addiction, alienation, depression, and self-destruction under the glare of the spotlight. Being known by strangers, and, even more dangerously, seeking their approval, is an existential trap. And right now, the condition of contemporary life is to shepherd entire generations into this spiritual quicksand.

As I’ve tried to answer the question of why we seek out the likes and replies and approval of strangers, and why this so often drives both ordinary and celebrated people toward breakdowns, I’ve found myself returning to the work of a Russian émigré philosopher named Alexandre Kojève, whose writing I first encountered as an undergraduate. In 1933, Kojève took over the teaching of a seminar on Hegel at the École Pratique des Hautes Études, in Paris. Though Kojève would live his life in relative obscurity, ultimately becoming a civil servant in the French trade ministry and helping to construct the architecture for a common Europe, his seminar on Hegel’s “Phenomenology of Spirit” was almost certainly the most influential philosophy class of the twentieth century. A Who’s Who of Continental thinkers, from Sartre to Lacan, passed through, and Kojève’s grand intellectual synthesis would deeply influence their work.

In his lectures, Kojève takes up Hegel’s famous meditation on the master-slave relationship, recasting it in terms of what Kojève sees as the fundamental human drive: the desire for recognition—to be seen, in other words, as human by other humans. “Man can appear on earth only within a herd,” Kojève writes. “That is why the human reality can only be social.”

Understanding the centrality of the desire for recognition is quite helpful in understanding the power and ubiquity of social media. We have developed a technology that can create a synthetic version of our most fundamental desire. Why did the Russian couple post those wedding photos? Why do any of us post anything? Because we want other humans to see us, to recognize us.

But We Who Post are trapped in the same paradox that Kojève identifies in Hegel’s treatment of the Master and Slave. The Master desires recognition from the Slave, but because he does not recognize the Slave’s humanity, he cannot actually have it. “And this is what is insufficient—what is tragic—in his situation,” Kojève writes. “For he can be satisfied only by recognition from one whom he recognizes as worthy of recognizing him.”

I’ve found that this simple formulation unlocks a lot about our current situation. It articulates the paradox of what we might call not the Master and the Slave but, rather, the Star and the Fan. The Star seeks recognition from the Fan, but the Fan is a stranger, who cannot be known by the Star. Because the Star cannot recognize the Fan, the Fan’s recognition of the Star doesn’t satisfy the core existential desire. There is no way to bridge the inherent asymmetry of the relationship, short of actual friendship and correspondence, but that, of course, cannot be undertaken at the same scale. And so the Star seeks recognition and gets, instead, attention.

The Star and the Fan are prototypes, and the Internet allows us to be both in different contexts. In fact this is the core, transformative innovation of social media, the ability to be both at once. You can interact with strangers, not just view them from afar, and they can interact with you. Those of us who have a degree of fame have experienced the lack of mutuality in these relationships quite acutely: the strangeness of encountering a person who knows you, who sees you, whom you cannot see in the same way.

We are conditioned to care about kin, to take life’s meaning from the relationships with those we know and love. But the psychological experience of fame, like a virus invading a cell, takes all of the mechanisms for human relations and puts them to work seeking more fame. In fact, this fundamental paradox—the pursuit through fame of a thing that fame cannot provide—is more or less the story of Donald Trump’s life: wanting recognition, instead getting attention, and then becoming addicted to attention itself, because he can’t quite understand the difference, even though deep in his psyche there’s a howling vortex that fame can never fill.

This is why famous people as a rule are obsessed with what people say about them and stew and rage and rant about it. I can tell you that a thousand kind words from strangers will bounce off you, while a single harsh criticism will linger. And, if you pay attention, you’ll find all kinds of people—but particularly, quite often, famous people—having public fits on social media, at any time of the day or night. You might find Kevin Durant, one of the greatest basketball players on the planet, possibly in the history of the game—a multimillionaire who is better at the thing he does than almost any other person will ever be at anything—in the D.M.s of some twentysomething fan who’s talking trash about his free-agency decisions. Not just once—routinely! And he’s hardly the only one.

There’s no reason, really, for anyone to care about the inner turmoil of the famous. But I’ve come to believe that, in the Internet age, the psychologically destabilizing experience of fame is coming for everyone. Everyone is losing their minds online because the combination of mass fame and mass surveillance increasingly channels our most basic impulses—toward loving and being loved, caring for and being cared for, getting the people we know to laugh at our jokes—into the project of impressing strangers, a project that cannot, by definition, sate our desires but feels close enough to real human connection that we cannot but pursue it in ever more compulsive ways.

So here we are, our chins resting on the metal bar between the fennec-fox ears, the constant flitting words and images of strangers entering our sensory system, offering our poor desiring beings an endless temptation—a power we should not have and that cannot make us whole.

An earlier version of this article misspelled Newton Minow’s name and incorrectly described the substance involved in the poisoning of Sergei Skripal.

