Facebook outsourced its fake news problem, and it appears the company isn't keeping its fact-checkers fully satisfied. According to a report from Politico, third-party fact-checkers want more internal data from Facebook, both on how impactful their work is on the platform and on which stories they should prioritise. Facebook isn't handing the information over, and fact-checkers are reportedly irked.
"I would say that the general lack of information — not only data — given by Facebook is a concern for a majority of publishers," Adrien Sénécat, a journalist at Le Monde, a news organisation that partnered with Facebook as part of the fact-checking project, told Politico.
In March, Facebook debuted a flagging system in which its third-party fact-checkers could tag individual stories they found factually inaccurate as "disputed". Hoaxes aren't eliminated from the site; instead, they carry links to factual information.
PolitiFact executive director Aaron Sharockman told Politico that there are upwards of 1500 stories fact-checkers could look at on a given day, but that due to a "pretty time intensive" process, they may only get to two, which is why knowing which ones to prioritise is so important.
Facebook argues that it isn't handing over the internal data to the fact-checkers because it's looking out for the privacy of its users.
"I think it's hard to strike the balance. We all have the same objective, to prevent false news from reaching people on our platform," Facebook News Feed product manager Sara Su told Politico. "We want to be as transparent as we can be while also respecting the privacy of people on our platform."
Facebook has not approached the spread of misinformation on its platform with much urgency. In November, Mark Zuckerberg said it was a "pretty crazy idea" that fake news could influence the outcome of the election, but a BuzzFeed report found that stories from fake news sources received more engagement than those from legitimate ones in the weeks leading up to the US presidential election.
Zuckerberg finally acknowledged his creation's influence in a Facebook post in December, promising to better fight misinformation on the social networking site with the introduction of third-party fact-checkers. Outsourcing the issue is a great way for Facebook to avoid the perception of censorship. It also gives the company an air of action without any of the accountability.
Even if Facebook did hand over the internal data to the fact-checkers, the question remains — what is the purpose of this fake news system?
In its current form, a hoax can still thrive on the platform; it may just include a label signalling that it's inaccurate, as well as some debunking information. While that's certainly helpful for those actively seeking out the truth, wouldn't it be more helpful to eliminate inaccurate links altogether? But Facebook, arguably the most influential news service in existence, won't do that: in addition to avoiding accountability like the plague, deleting fake stories would force Facebook to sacrifice some of its sweet daily active users who might view the move as censorship. Bottom line is king, baby.