
Facebook’s Quest To Stop Fake News Risks Becoming Slippery Slope

Last week, a story claiming that Ford Motor Co. was moving truck production from Mexico to Ohio went viral on Facebook. “The Trump Effect: It’s Happening Already!!” the Facebook user Right Wing News wrote. That story was actually based on a CNN report from 2015, before Donald Trump was even the Republican nominee for president.

The post was neither entirely true nor completely false. It fell into a gray area in the nuanced world of fact-checking, highlighting the thorny challenge of cracking down on fake news. While some articles are obviously fake, like one about the Pope endorsing Trump, many others are misleading, exaggerated or distorted, but contain a kernel of truth. They require judgment calls, and it can be hard to tell where to draw the line, professional fact-checkers say.

“It is a very slippery slope,” said Eugene Kiely, the director of FactCheck.org, a nonprofit that aims to reduce the level of deception and confusion in U.S. politics. “There’s bad information out there that’s not necessarily fake. It’s never as clear-cut as you think.”

Facebook is taking steps to address its role in spreading fake news, such as enlisting the help of third-party fact-checkers, Chief Executive Officer Mark Zuckerberg said Friday in a post. The social network was widely criticized for allowing false stories to circulate in the run-up to the U.S. presidential election, potentially influencing its outcome. Zuckerberg underscored the delicate balance his company must strike, saying “we need to be careful not to discourage sharing of opinions or mistakenly restricting accurate content.”

“We do not want to be arbiters of truth ourselves, but instead rely on our community and trusted third parties,” he wrote.

Yet professional fact-checkers say Facebook must not punish articles that are partially accurate. They say their jobs, like the truth, can be complicated, which is why they grade stories on a scale. For example, Snopes.com rated the Ford Ohio story “mostly false,” and labels other stories “unproven” or a “mixture” of true and false. The fact-checking website PolitiFact uses labels like “true,” “half true,” or “pants on fire.” Facebook’s algorithm may not understand the various shades of falsehood.

“It’s easy to see how an algorithm-only solution to fake news could result in blocking stuff that’s not false or is misleading for reasons that are partisan but not inaccurate,” said Alexios Mantzarlis, who leads Poynter’s International Fact-Checking Network.

One article that circulated widely on Facebook before the election claimed Bill and Hillary Clinton had stolen White House furniture. The allegation actually dates back to when the Clintons left the White House. They returned many pieces of furniture and paid the government back for some gifts, according to PolitiFact, which concluded a version of the story contained “several inaccuracies” and was “mostly false,” but added “there is a grain of truth.”

“Did they steal furniture from the White House?” FactCheck.org’s Kiely said. “That’s a judgment call.”

Tweaking the Algorithm

In recent days, media critics and fact-checkers have suggested a variety of ways that Facebook could address the problem of fake news. One solution: Facebook could tweak its algorithm to promote related articles from sites like FactCheck.org so they show up next to questionable stories on the same topic in the news feed. Google, for example, announced last month that it would start labeling fact-checked articles in Google News results.

“If we’ve all looked at it and all agreed this is something that is false or misleading, there should be a way to push that up and bring that to the attention of the reader,” Kiely said.

Facebook should also make it easier for users to flag fake news and see which media company published a story, according to John Borthwick, chief executive officer of Betaworks, and Jeff Jarvis, the director of the Tow-Knight Center for Entrepreneurial Journalism at the City University of New York. In a post on Medium.com, they suggested that media companies and social-media sites collaborate to address the problem.

‘Overwhelming’ the Checkers

But even an army of fact-checkers may not be enough to police the deluge of dubious stories on Facebook. Snopes.com, for instance, gets as many as 300 e-mails an hour from internet users asking whether something they’ve read is true, Brooke Binkowski, managing editor at Snopes.com, said on Bloomberg’s Decrypted podcast. During the election, the volume of requests to FactCheck.org was “overwhelming,” Kiely said.

“We couldn’t keep up with it if we tried,” he said.

With so much fake news to debunk, Facebook could hire a team of fact-checkers to verify only the most popular articles, Mantzarlis said. Those people could investigate the top stories in Facebook’s “trending” section, for example. In doing so, Facebook could cripple the reach of repeat offenders and flag their posts as fake, he said.

“No one wants Facebook to deploy 1 million fact-checkers on every single post,” he said. “The problem is the stuff that surfaces all the way to the trending section. That’s a place where Facebook could make a large impact with relatively small commitment.”

This article was provided by Bloomberg News.
 
