Saturday, October 15, 2016

Fake News Taints Facebook's Trending Topics

Facebook's Trending Topics section has recently carried a number of stories that were either "indisputably fake" or "profoundly inaccurate," The Washington Post reported this week.

Six weeks ago, the feed ran a false story claiming Fox News had fired anchor Megyn Kelly for being a closet liberal who supported Hillary Clinton. Facebook removed the story, apologized, and promised to do better.

It appears that despite that commitment, the Trending Topics section is not yet problem free.

Trending Tall Tales

In an experiment conducted over several weeks following Facebook's promotion of the fake Megyn Kelly story, the Post recorded which topics were trending, every day on the hour, across four of its accounts.

That turned up five trending stories that were "indisputably fake" and three that were "profoundly inaccurate," Caitlin Dewey reported.

There's no way to know whether those were the only false or highly inaccurate articles that made the Trending Topics feed during the experiment's run.

"If anything, we've underestimated how often" Facebook trends fake news, Dewey wrote.

Further, news releases, blog posts from sites such as Medium, and links to online stores such as iTunes regularly trended, the experiment revealed.

"The issue which has long bedeviled journalism is speed versus accuracy," noted David Abrahamson, a professor of journalism at Northwestern University's Medill School of Journalism.

"In the brave new social media world, speed is everything, and veracity seems to not be regarded as too important," he told TechNewsWorld.

In the Fake News Pot

On Aug. 31, a story about an administrator at Clemson University kicking a praying man off campus trended, The Washington Post noted. The university debunked that story.

On Sept. 8, Facebook promoted "a breathless account" of the iPhone's "new and literally magical features," sourced from Faking News, the satire section of the real news site Firstpost.

On Sept. 9, a story claiming the Sept. 11 attacks were a controlled demolition trended.

Several days later, Facebook promoted a story about the Buffalo Bills from the satirical site SportsPickle.

Facebook's Responsibility

Facebook's responsibility for the accuracy of the news and information it distributes is unclear.

The pivotal issue is whether Facebook is a common carrier, suggested Michael Jude, a program manager at Stratecast/Frost & Sullivan.

Facebook will have to ensure the stories it carries are factual only "if they represent themselves as an objective news site, which they don't," he told TechNewsWorld.

The company "makes it very clear that there are things they'll decide shouldn't be carried and that they'll take off their site," Jude said. "Likewise, they don't have to ensure what they carry is accurate. They haven't guaranteed that they'd be objective."

On the other hand, Facebook should be held to the same standards as other news organizations, given that an increasing number of people are getting their news from its site, contended Medill's Abrahamson.

"But who will judge when the number of eyeballs is the holy grail?" he asked.

Humans vs. Algorithms

Human editors in Facebook's Trending Topics department recently came under fire for allegedly applying an anti-conservative bias to the feed's content. Facebook denied the allegations but took steps to reassure critics, replacing its human editorial team with a process that relied on algorithms.

"When you take human judgment out of the loop, even though it's flawed and can be biased, you can't guarantee the veracity of any of the sources," Frost's Jude remarked. "That's why newspapers traditionally had editorial boards whose members had a wide range of philosophies and political persuasions."

Further, people are better than machines at adapting to situations in which others are trying to game them, suggested Rob Enderle, principal analyst at the Enderle Group.

To make adjustments in algorithms, systems "have to be rewritten," he told TechNewsWorld.

Machines "can't inherently learn yet that they're being tricked," Enderle pointed out.

"No one, to my knowledge, has ever deeply studied the accuracy of human editors," Medill's Abrahamson said, "but they do traditionally take their evaluative function seriously, which Facebook News apparently does not."

Possible Fixes for Facebook

Facebook should "bring back people until they can apply deep learning to their automated solution or otherwise make it far harder to trick," said Enderle.

Deep learning "could catch fake sites and those running malware -- and, based on user behavior, could downrate sites that are likely fake," he added. It also could "scan sites like Snopes to identify recurring fake stories early."

The first thing Facebook should do is care, declared Medill's Abrahamson. "Instead, the firehose meme seems to apply."

