Google yesterday announced it will introduce a fact check tag on Google News in order to display articles that contain factual information next to trending news items. Now it’s time for Facebook to take fact-checking more seriously, too.
Facebook has stepped into the role of being today’s newspaper: that is, it’s a single destination where a large selection of news articles are displayed to those who visit its site. Yes, they appear amidst personal photos, videos, status updates, and ads, but Facebook is still the place where nearly half of American adults get their news.
Facebook has a responsibility to do better, then, when it comes to informing this audience what is actually news: what is fact-checked, reported, vetted, legitimate news, as opposed to a rumor, hoax or conspiracy theory.
It’s not okay that Facebook fired its news editors in an effort to appear impartial, deferring only to its algorithms to inform readers what’s trending on the site. Since then, the site has repeatedly trended fake news stories, according to a Washington Post report released earlier this week.
The news organization tracked every news story that trended across four accounts during the workday from August 31 to September 22, and found that Facebook trended five stories that were either “indisputably fake” or “profoundly inaccurate.” It also regularly featured press releases, blog posts, and links to online stores, like iTunes – in other words, trends that didn’t point to news sites.
Facebook claimed in September that it would roll out technology that would combat fake stories in its Trending topics, but clearly that has not yet come to pass – or the technology isn’t up to the task at hand.
Facebook not only fails to vet its Trending news links, it also has no way of flagging the questionable links that fill the rest of its site.
Outside of Trending, Facebook continues to be filled with inaccurate, poorly sourced, or outright fake news stories, rumors and hoaxes. Maybe you’re seeing fewer of them in the News Feed, but there’s nothing to prevent a crazy friend from commenting on your post with a link to a well-known hoax site, as if it’s news. There’s no tag or label. They get to pretend they’re sharing facts.
Meanwhile, there’s no way for you to turn off commenting on your own posts, even when the discussion devolves into something akin to “sexual assault victims are liars” (to reference a recent story).
Because perish the thought that Facebook would turn off the one mechanism that triggers repeat visits to its site, even if keeping it on means triggering traumatic recollections in its users instead.
There is a difference between a post that’s based on fact-checked articles, and a post from a website funded by an advocacy group. There’s a difference between Politifact and some guy’s personal blog. Facebook displays them both equally, though: here’s a headline, a photo, some summary text.