Google yesterday announced it will introduce a fact-check tag on Google News in order to display articles containing fact-checked information next to trending news items. Now it’s time for Facebook to take fact-checking more seriously, too.
Facebook has stepped into the role of being today’s newspaper: that is, it’s a single destination where a large selection of news articles is displayed to those who visit its site. Yes, they appear amidst personal photos, videos, status updates, and ads, but Facebook is still the place where nearly half of American adults get their news.
Facebook has a responsibility to do better, then, when it comes to informing this audience what is actually news: what is fact-checked, reported, vetted, legitimate news, as opposed to a rumor, hoax or conspiracy theory.
It’s not okay that Facebook fired its news editors in an effort to appear impartial, deferring only to its algorithms to inform readers what’s trending on the site. Since then, the site has repeatedly trended fake news stories, according to a Washington Post report released earlier this week.
The news organization tracked every news story that trended across four accounts during the workday from August 31 to September 22, and found that Facebook trended five stories that were either “indisputably fake” or “profoundly inaccurate.” It also regularly featured press releases, blog posts, and links to online stores, like iTunes – in other words, trends that didn’t point to news sites.
Facebook claimed in September that it would roll out technology that would combat fake stories in its Trending topics, but clearly that has not yet come to pass – or the technology isn’t up to the task at hand.
In any event, Facebook needs to do better.
It’s not enough for the company to merely reduce the visibility of obvious hoaxes in its News Feed – not when so much of the content that circulates on the site is posted by people – your friends and family – right on their profiles, which you visit directly.
Plus, the more the items are shared, the more they have the potential to go viral. And viral news becomes Trending news, which is then presented to all of Facebook’s users in that region.
This matters. Facebook has trended a story from a tabloid news source that claimed 9/11 was an inside job involving planted bombs. It ran a story falsely claiming that Fox News anchor Megyn Kelly had been fired. These aren’t mistakes: they are disinformation.
Facebook has apologized for the above, but declined to comment to The Washington Post regarding its new findings that fake news continues to be featured on the platform.
In addition, not only does Facebook fail at vetting its Trending news links, it also has no way of flagging the links that fill its site.
Outside of Trending, Facebook continues to be filled with inaccurate, poorly sourced, or outright fake news stories, rumors and hoaxes. Maybe you’re seeing fewer of them in the News Feed, but there’s nothing to prevent a crazy friend from commenting on your post with a link to a well-known hoax site, as if it’s news. There’s no tag or label. They get to pretend they’re sharing facts.
Meanwhile, there’s no way for you to turn off commenting on your own posts, even when the discussion devolves into something akin to “sexual assault victims are liars” (to reference a recent story).
Because perish the thought that Facebook would turn off the one mechanism that triggers repeat visits to its site – even if keeping it on means triggering traumatic recollections in its users instead.
There is a difference between a post that’s based on fact-checked articles, and a post from a website funded by an advocacy group. There’s a difference between Politifact and some guy’s personal blog. Facebook displays them both equally, though: here’s a headline, a photo, some summary text.
Of course, it would be a difficult job for a company that only wants to focus on social networking and selling ads to get into the media business – that’s why Facebook loudly proclaims it’s “not a media company.”
Except that it is one. It’s serving that role, whether it wants to or not.
Google at least has stepped up to the plate and is trying to find a solution. Now it’s Facebook’s turn.
Facebook may have only unintentionally become a media organization, but it is one. And it’s doing a terrible job.