
Why the Fact-Checking at Facebook Needs to Be Checked

Since the 2016 presidential campaign, Facebook has taken several steps to curb the spread of false news articles on its platform, most notably by attaching a “disputed” label to articles that fact checkers have rated false or misleading.

But how effective are these measures?

To date, Facebook has offered the public little information beyond a recent email to fact checkers asserting that labeled articles receive 80 percent fewer impressions. One statistic is not enough to judge the success of these efforts, and research (including my own) suggests that the effectiveness of Facebook’s interventions needs careful evaluation.

Yale’s Gordon Pennycook and David Rand offer two principal warnings. First, the effects of exposure to false information are not easily countered by labeling, as they find in a paper they wrote with Tyrone D. Cannon. False information we have previously encountered feels more familiar, producing a feeling of fluency that causes us to rate it as more accurate than information we have not seen before. This effect persists even when Facebook-style warnings label a headline “disputed.” We should be cautious about assuming that labels tagging articles as false are enough to prevent misinformation on social media from affecting people’s beliefs.

[Image: Images from the survey Brendan Nyhan and his students used as research for their paper about the effects of labeling news as fake. (For the record, Mr. Trump did not plagiarize “Bee Movie” in his inaugural address.)]

In a second paper, Mr. Pennycook and Mr. Rand find that the presence of “disputed” labels causes study participants to rate unlabeled false stories as slightly more accurate, an “implied truth” effect. If Facebook is seen as taking responsibility for the accuracy of information in its news feed through labeling, readers could start assuming that unlabeled stories have survived scrutiny from fact checkers. That assumption is rarely correct: far too many stories circulate for fact checkers to review them all.

Encouragingly, my students at Dartmouth College and I find that the effects of Facebook-style “disputed” banners on the perceived accuracy of false headlines are larger than those Mr. Pennycook and Mr. Rand observed. In our study, the proportion of respondents rating a false headline as “somewhat” or “very accurate” fell to 19 percent with the standard Facebook “disputed” banner, from 29 percent in the unlabeled condition. It fell further, to 16 percent, when the warning instead stated that the headline was “rated false.” (We find no evidence of a large “implied truth” effect, though we lack the statistical precision necessary to detect effects of the size Mr. Pennycook and Mr. Rand measure.)

Other results suggest further reason for caution. In April, Facebook gave users “Tips to Spot False News” in an article linked at the top of their news feeds. We find that general warnings of this sort are potentially counterproductive. When we showed a warning reminding users to “remain skeptical” and “think critically” to spot misleading articles, it did reduce belief in the accuracy of false news headlines somewhat, but it also somewhat reduced the perceived accuracy of headlines that were not false. Readers seemed to become more skeptical of every headline they encountered. (We observed no similar spillover for the fact-check banners.)

Given these challenges, Facebook’s partnership with fact checkers may be most effective in providing information to the platform’s news feed algorithm, which allows the company to reduce the prominence of articles that fact checkers have rated false or misleading. As the studies we’ve cited suggest, changing human beliefs is far harder than providing input to a computer program. But we should be cautious about placing too much trust in a private algorithm that works most effectively by suppressing information, especially without further evaluation of its effects. Ultimately, the fight against false news must be won in public.
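To make the contrast concrete, consider what “providing input to a computer program” involves. The Python sketch below is a minimal, hypothetical illustration of demoting fact-checked stories in a ranked feed. Facebook’s actual ranking system is private, so every name and number here is invented for illustration, except the demotion factor, which is set to mirror the 80 percent drop in impressions Facebook reported to fact checkers.

from dataclasses import dataclass

@dataclass
class Story:
    headline: str
    base_score: float          # relevance score from the feed's usual signals
    rated_false: bool = False  # flagged by a fact-checking partner

# Hypothetical value, chosen to mirror the reported 80 percent drop in
# impressions; the real mechanism and magnitude are not public.
DEMOTION_FACTOR = 0.2

def ranking_score(story: Story) -> float:
    # Demote flagged stories rather than relying on labels alone.
    return story.base_score * (DEMOTION_FACTOR if story.rated_false else 1.0)

feed = [
    Story("Dubious viral claim", base_score=0.9, rated_false=True),
    Story("Ordinary local news item", base_score=0.5),
]
feed.sort(key=ranking_score, reverse=True)
print([s.headline for s in feed])  # the flagged story now ranks last

The one-line demotion is the easy part. As the studies above show, the hard problems are the human ones: fluency, implied truth and spillover skepticism.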

Brendan Nyhan is a professor of government at Dartmouth College. Follow him on Twitter at @BrendanNyhan.


