Blame social media again —

Distraction, not partisanship, drives sharing of misinformation

But getting people to pay attention to news quality doesn't seem to help much.


We don't need a study to know that misinformation is rampant on social media; we just need to do a search for "vaccines" or "climate change" to confirm that. A more compelling question is why. It's clear that, at a minimum, there are contributions from organized disinformation campaigns, committed political partisans, and questionable algorithms. But beyond those, there are still a lot of people who choose to share stuff that even a cursory examination would show was garbage. What's driving them?

That was the question that motivated a small international team of researchers who decided to take a look at how a group of US residents decided on which news to share. Their results suggest that some of the standard factors that people point to when explaining the tsunami of misinformation—inability to evaluate information and partisan biases—aren't having as much influence as most of us think. Instead, a lot of the blame gets directed at people just not paying careful attention.

You shared that?

The researchers ran a number of fairly similar experiments to get at the details of misinformation sharing. This involved panels of US-based participants recruited either through Mechanical Turk or via a survey population that provided a more representative sample of the US. Each panel had several hundred to over 1,000 individuals, and the results were consistent across different experiments, so there was a degree of reproducibility to the data.

To do the experiments, the researchers gathered a set of headlines and lead sentences from news stories that had been shared on social media. The set was evenly mixed between headlines that were clearly true and clearly false, and each of these categories was split again between those headlines that favored Democrats and those that favored Republicans.
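As a rough illustration of that design (not the researchers' actual materials or code), assembling a balanced set like that might look something like the sketch below, with a hypothetical Headline type and placeholder data:

```python
# Illustrative only: a stimulus set crossing veracity with partisan lean,
# as described above. The Headline objects are hypothetical placeholders.
import random
from dataclasses import dataclass

@dataclass
class Headline:
    text: str        # headline plus lead sentence
    is_true: bool    # fact-checked veracity
    favors: str      # "Democrats" or "Republicans"

def build_stimulus_set(pool, per_cell):
    """Draw an equal number of headlines from each veracity x lean cell."""
    selected = []
    for truth in (True, False):
        for lean in ("Democrats", "Republicans"):
            matches = [h for h in pool if h.is_true == truth and h.favors == lean]
            selected.extend(random.sample(matches, per_cell))
    random.shuffle(selected)  # randomize presentation order for each participant
    return selected
```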

One thing that was clear is that people are generally capable of judging the accuracy of the headlines. There was a 56 percentage point gap between how often an accurate headline was rated as true and how often a false headline was. People aren't perfect—they still got things wrong fairly often—but they're clearly quite a bit better at this than they're given credit for.

The second thing is that ideology doesn't really seem to be a major factor in driving judgements on whether a headline was accurate. People were more likely to rate headlines that agreed with their politics as accurate, but the difference here was only 10 percentage points. That's significant (both societally and statistically), but it's certainly not a large enough gap to explain the flood of misinformation.

But when the same people were asked whether they'd share these same stories, politics played a big role, and the truth receded. The difference in intention to share between true and false headlines was only six percentage points, while whether a headline agreed with a person's politics made a 20 percentage point difference. Putting it in concrete terms, the authors look at the false headline "Over 500 'Migrant Caravaners' arrested with suicide vests." Only 16 percent of the conservatives in the survey population rated it as true. But over half of them were amenable to sharing it on social media.

Overall, the participants were twice as likely to consider sharing a false headline that aligned with their politics as they were to rate it as accurate. Yet amazingly, when the same population was asked whether it's important to share only accurate content on social media, the most common answer was "extremely important."
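For the arithmetic-minded, here's a minimal sketch of how those truth gaps are computed. The individual rating and sharing rates below are hypothetical round numbers chosen only to reproduce the gaps reported above; they are not the paper's raw data.

```python
# Hypothetical per-headline rates that reproduce the reported gaps.
rated_accurate = {"true": 0.73, "false": 0.17}   # fraction rating the headline accurate
would_share    = {"true": 0.29, "false": 0.23}   # fraction willing to share it

accuracy_gap = rated_accurate["true"] - rated_accurate["false"]
sharing_gap  = would_share["true"] - would_share["false"]

print(f"Truth gap in accuracy judgements: {accuracy_gap * 100:.0f} points")  # ~56
print(f"Truth gap in sharing intentions:  {sharing_gap * 100:.0f} points")   # ~6
```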

What’s going on here?

So, people can distinguish what's accurate, and say sharing what's accurate is important, but when it comes down to actually making the decision to share, accuracy doesn't matter much. Or, as the researchers put it, something about the social media context shifts people's attention away from caring about the truth, and onto the desire to get likes and signal their ideological affiliation.

To get at whether this might be the case, the researchers altered the experiment slightly to remind people about the importance of accuracy. In the modified survey, they started off by asking people to rate the accuracy of a non-partisan news headline, which should make participants more conscious of the need to make those sorts of judgements. Those who received this prompt were less likely to report that they were interested in sharing fake news headlines, especially when said headlines agreed with their politics. Similar things occurred when people were simply asked about the importance of accuracy before taking the survey rather than after.

All of this is consistent with the idea that people do value accuracy but don't necessarily think much about it when they're using social media. Overall, the researchers estimate that this sort of inattention accounts for about half of the decisions to share misinformation. By contrast, the inability to identify misinformation accounts for less than a third, and partisan influences explain 16 percent.
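As back-of-the-envelope arithmetic, those three factors roughly partition all of the misinformation sharing the researchers saw. The exact values below are approximations of "about half," "less than a third," and 16 percent, not the paper's precise estimates.

```python
# Rough illustration of the attribution described above.
share_of_misinfo_sharing = {
    "inattention to accuracy": 0.51,
    "inability to spot falsehoods": 0.33,
    "partisanship outweighing accuracy": 0.16,
}
for factor, share in share_of_misinfo_sharing.items():
    print(f"{factor}: {share:.0%}")
print(f"total accounted for: {sum(share_of_misinfo_sharing.values()):.0%}")
```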

Finally, the researchers did a bit of a real-world experiment, contacting over 5,000 Twitter users who had previously shared links to Breitbart or Infowars, two major sources of inaccurate, partisan information. The researchers asked these users to rate the accuracy of a single, non-partisan headline in the hope that it would act as a nudge to get them to consider accuracy before sharing something.

And the nudge apparently worked. Overall, the quality of the news sources behind the articles shared by these users edged up by five percent. That may sound modest, but it worked out to mean they were 2.8 times more likely to share material from mainstream news sites.

Not just one problem

The overall conclusion here is in keeping with a lot of prior research, so it's not especially surprising. There's extensive experimentation showing that people tend to reach snap judgements that signal their cultural and ideological affinities; the mental energy they should expend evaluating these snap judgements instead typically gets directed to defending them after they're made. It's easy to square this with the overall conclusion that, when people aren't specifically focused on accuracy, partisanship plays a large role.

While inattention may be the largest single factor here, it's clearly not the only one; the inability to judge accuracy also plays a major role, and there are clearly some cases where partisan concerns outweigh accuracy. That last case is probably worth examining in far more detail, and the data the researchers already have could provide important information. Is this group driven by specific stories that partisans found important to advance? Or is it driven by a small number of people who consistently choose to share partisan stories regardless of their accuracy?

The last thing that's clear is that there's no easy solution here. While a nudge can get people to shift their behavior a bit, it falls far short of eliminating the problem. And it won't have any influence on the large number of accounts that exist solely to take part in organized misinformation campaigns.

Nature, 2021. DOI: 10.1038/s41586-021-03344-2  (About DOIs).

Listing image by Lewis Ogden / Flickr
