Facebook reportedly testing new tool to combat fake news
Facebook appears to be testing a tool designed to help it identify and hide so-called “fake news” on the social network, in an attempt to quell increasingly vocal criticism of its role in spreading untruths and propaganda.
The tool, reported by at least three separate Facebook users on Twitter, asks readers to rank on a scale of one to five the extent to which they think a link’s title “uses misleading language”. The articles in question were from reliable sources: Rolling Stone magazine, the Philadelphia Inquirer, and Chortle, a news site that covers comedy.
It isn’t clear how Facebook intends to act on the data it is collecting, or whether it intends to act at all. Misleading link text is certainly part of the fake news problem on the social network, as evidenced by the two misleading adverts that accompanied Facebook chief executive Mark Zuckerberg’s 18 November post about fake news. (That post was temporarily deleted by Facebook, before the site acknowledged the “system error”.)
The problem of misleading links is compounded by Facebook’s user interface, which serves to de-emphasise links to external sources in favour of encouraging users to like, share or comment on the site itself. Research suggests that almost 60% of social media shares come from users who never clicked the link, implying that the headline drives discussion and sharing far more than the content of an article.
At the same time, much of the conversation around fake news has focused on articles and publications with many more problems than simple misleading headlines. The “Pizzagate” conspiracy theory, which resulted in a self-radicalised gunman discharging his weapon in a popular pizza restaurant in Washington DC on Sunday, was spread with the help of a string of fake news stories falsely accusing the owners of being part of a made-up paedophile ring with supposed ties to Hillary Clinton.
While the Pizzagate stories ranged from misleading to outright fabricated, their headlines were accurate summaries of their content, suggesting that Facebook readers who clicked through would end up rating them as not misleading under the site’s experimental system.