Facebook on Monday promoted a hoax in its trending news section — that Fox News anchor Megyn Kelly had been fired for being a “traitor.”

Fox News Channel anchors and debate moderators (L-R) Chris Wallace, Megyn Kelly and Bret Baier begin the debate held by Fox News for the top 2016 U.S. Republican presidential candidates in Des Moines, Iowa, January 28, 2016. Photo by Carlos Barria via Reuters
Slate.com called out Facebook for promoting a story that “racked up thousands of … shares and was likely viewed by millions before Facebook removed it for inaccuracy.”

How did this social-media monolith mess up?

“The blunder came just three days after Facebook fired the entire New York–based team of contractors that had been curating and editing the trending news section, as Quartz first reported on Friday and Slate has confirmed,” Slate said. “That same day, Facebook announced an ‘update’ to its trending section — a feature that highlights news topics popular on the site — that would make it ‘more automated.’”

Slate said Facebook replaced the New York–based team of contractors, “most of whom were professional journalists,” with a new team of overseers.

“Apparently it was this new team that failed to realize the Kelly story was bogus when Facebook’s trending algorithm suggested it.”

A company spokeswoman said: “We’re working to make our detection of hoax and satirical stories more accurate as part of our continued effort to make the product better.”

The American Press Institute added: “Trending topics will appear as either a short phrase or single word with a number representing how many people are talking about the topic on Facebook.

“But removing humans from trending topics doesn’t necessarily remove bias, Quartz notes: ‘Facebook’s primary reason for hiring human curators appeared to be to train their algorithms in what was newsworthy — and so it’s very likely their human biases were recorded and potentially amplified by the AI.’”

Facebook took it on the chin in reactions across social media.
