The Fake News Culprit No One Wants to Identify: You

Facebook and Twitter won't fix this problem alone, says danah boyd. Today's information wars are also a reflection of us.

The other week, Facebook chose a curious moment to give me a survey. I had just deleted the app from my phone, likely because of some fresh horror about ad targeting, and when I next pulled up the site in my browser, I got this message: “Please agree or disagree with the following statement: Facebook is good for the world.” I rolled my eyes, “strongly disagreed,” and logged out of my browser. But eight hours later, I was back to scrolling through my News Feed. This pattern isn’t new: I’ve spent much of the last year insisting to anyone who’ll listen that Facebook, Twitter, and the like will be responsible for the demise of democracy—while being drawn back to the feeds, again and again.

There’s an instinct to point fingers, to find someone to blame for the information hellscape in which we now find ourselves. Every day one tech giant or another is forced to play defense, whether it’s Facebook being called out yet again for letting advertisers exclude audiences by race or Twitter bending to the whims of white nationalists who want to target reporters. Because we can’t quit the products, we become desperate for the companies to save us from ourselves.

That’s not going to happen, argues Data & Society founder and Microsoft researcher danah boyd. Google, Facebook, Twitter—none of these companies is sitting on a silver-bullet solution. As boyd wrote for us earlier this year, we have more than a technology problem: “[W]e have a cultural problem, one that is shaped by disconnects in values, relationships, and social fabric. Our media, our tools, and our politics are being leveraged to help breed polarization by countless actors who can leverage these systems for personal, economic, and ideological gain.” I spoke with boyd about the shifting public discourse around online disinformation campaigns, and what role the tech industry should play in rebuilding American society.

Miranda Katz: Back in March, the debate over fake news and what tech companies like Google and Facebook should be doing about it felt like it was reaching a fever pitch. You wrote a piece for us arguing that we can’t just look to the tech companies to fix fake news: We have to understand it as a cultural problem, too. That debate hasn't let up. Do you think it's still overly focused on finding a technological solution?

danah boyd: I think that it's still absolutely focused on the idea that technology will solve our way out of this. I think that we're still not taking a true public accounting of all of the different cultural factors that are at play. What's really striking about what's at stake is that we have an understanding of our American society and of there being a rational, bureaucratic process around democracy. But now there are such notable societal divisions, and rather than trying to bridge them, trying to remedy them, trying to figure out why people's emotions are speaking past one another, it's about looking for someone to blame, somebody we can hold responsible without holding ourselves individually and collectively responsible. Unfortunately, that's going to do squat. And, for the most part, we're looking for something new to blame, which is why so much of the attention is focused on technology companies instead of politics, news media, or our economic incentives. We need to hold ourselves individually and collectively responsible, but that's not where people are at.

We're not seeing something that is brand new. We're just distraught because hatred, prejudice, and polarization are now extraordinarily visible, and because the people who have power in this moment are not the actors that some of us believe should have power. And, of course, technology mirrors and magnifies the good, bad, and ugly of everyday life. There's a peculiar contradiction and challenge in what we've built [with these platforms]. So many early internet creators hoped to build a decentralized system that would allow anybody to have power. We didn't account for the fact that the class of people who might leverage this strategically may do so for nefarious, adversarial, or destructive purposes.

On top of fake news, we're now also grappling with these bigger questions of foreign interference and troubling political ad targeting. And we’re still pointing fingers at Google and Facebook, and demanding a fix. What do you make of that response?

I'm not going to say that foreign interference is acceptable, but I am going to say that we've got bigger problems that we're not willing to address. And now we want to create a bogeyman. When it comes to Facebook, I have no doubt that a whole lot of people received content from a whole set of adversarial actors. I think the main response has been for most people to simply distrust their information landscape. The reason why Russia is relevant in all of this is because Russia is notorious for relishing opportunities to cause people to distrust information landscapes. That has been their approach from a state position for quite some time. So in some ways, our panic about this just did the work for them. A news media obsessing over Russia did the work the Russians were trying to do far better than any Facebook ads they could have bought.

In your book It’s Complicated, you write about how social media, like any new technology, tends to spark a moral panic at first—but usually that dies down. Social media has been around for some time now, and it seems like every day there's a new panic over its implications. Do you think it's becoming an exception to that rule?

Moral panics last for a while. This is a shifting one, and there's a variety of proxy panics going on. Do you think that #MeToo would have happened if we hadn't elected Trump? It's not like there haven't been creepy men for a very long time. But because we can't challenge the lecherous behavior of our president, who's fully admitted to being a sexual harasser, we're going to proxy fight with all of the other creepy men out there. It's not simply a moral panic. It's a proxy panic. We don't know how to talk about the failings of financialized capitalism. We don't know how to talk about the failings of our political infrastructure. We don't know how to talk about massive polarization in our public.

How do we reconcile knowing that—being aware of the fact that our panic over social media and disinformation is really a proxy panic for something much bigger—with the fact that we do still want these tech companies to stop playing defense and begin tackling these problems proactively?

I absolutely believe we do. But I don't think it's a silver bullet. And the efforts that they're making so far are just what they need to do as a baseline response under pressure. They don't have the incentive structures to fix the underlying problems, just like our political establishment doesn't, and just like our financial ecosystem doesn't. I think expecting them to do it on their own is naïve. Part of it is like, what would it take to restructure the configuration of finance, political governance, and corporate activity for something that's a public good? It's a complicated question. I think that yes, of course, they should be doing a lot more. Yes, of course, there should be mounting pressure. And there's nothing like shame to actually push on that. But I think that we are focusing on them without actually accounting for the bigger picture. We're not even looking at how their structure as financialized global companies forces them to make decisions that are not in the interests of any nation-state's citizens.

So where do we go from here?

It's actually really clear: How do you reknit society? Society is produced by the social connections that are knit together. The stronger those networks, the stronger the society. We have to make a concerted effort to create social ties, social relationships, social networks in the classic sense that allow for strategic bridges across the polis so that people can see themselves as one. And one of the things we don't account for in our history as a country is that we did a lot of this instinctively. The creation of the US military was actually a very specific strategic networked part of America's fabric. It allowed you to meet people across every line. The way in which we've done higher ed historically has actually created an unbelievable network. Missionary work is another one. Part of what is really collapsing here is that the networks have become too fragmented and too polarized. Technology doesn't help; it simply magnifies the poles. This is dangerous and cyclical. Polarization leads to distrust and tribalism, which lead to more polarization.

So for me, the path forward, which requires business and the public sector and civil society working together, is about reconstructing the networks of America. I think that one of the mistakes that people in the tech sector have made is that they realized the importance of connecting people across distance, but they thought that it would happen naturally if they just made it possible. And they were wrong. They were wrong to assume that people would actively connect with those who were different from them simply because technology made it possible. You actually have to make it intentional. I think there's a lot that the tech sector can and should do around this. No one has a better model of the networks of America than those tech companies. No one understands better where the disconnects are. What would it mean to actually understand and seek to remedy the divisions? But I don't know that that can be done in a financialized way. Actually, I know it can't be done in a financialized way. I want regulators to work toward rebuilding the networks of America, not regulate toward fixing an ad.