If you're a frequent reader of this site, you might remember how in August we covered the baffling McChicken trending video. That video, an extremely NSFW clip of a man masturbating with a chicken sandwich, made it onto the Facebook feeds of millions — all thanks to the platform's new method for highlighting trending news stories.
It's no coincidence that this video went viral right after Facebook fired its entire editorial team. Before August 26th, news stories that appeared in users' feeds were curated by a team of human editors. On that date, Facebook unceremoniously fired all of them.
At the time, sites like the Guardian pounced on Facebook's decision, which quickly led to fake stories (like one reporting Megyn Kelly had been fired from Fox News), disgusting stories (like the McChicken video), and fake, disgusting stories (like 9/11 conspiracy theories) reaching user feeds.
Facebook apologized and promised it was "working to make our detection of hoax and satirical stories quicker and more accurate," according to a statement from VP of global operations Justin Osofsky made to CBS News.
But, according to an investigation by the Washington Post, Facebook has failed to keep its promise.
The investigation, run by WaPo's Intersect team, uncovered a number of misleading, inaccurate, and downright fake stories being promoted to the Facebook feeds of Intersect members. "On top of that," writes the Intersect team, "we found that news releases, blog posts from sites such as Medium and links to online stores such as iTunes regularly trended."
The reason for all this is simple: Facebook's trending algorithm simply doesn't do as good a job as human editors do. According to the Post's report, human editors at Facebook were instructed to independently verify topics that the algorithm brought to their attention before pushing them to feeds. Once the editors were fired, engineers "were told to accept every trending topic linked to three or more recent articles, from any source, or linked to any article with at least five related posts."
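Going by the Post's description, the post-editor rule is purely mechanical: count recent articles and related posts, with no credibility check anywhere. Here's a minimal sketch of that logic in Python; the names and data shapes are illustrative, not Facebook's actual code.

```python
# Hypothetical sketch of the acceptance rule the Post describes.
# All class and function names here are invented for illustration.

from dataclasses import dataclass, field
from typing import List


@dataclass
class Article:
    source: str
    related_posts: int  # posts linking back to this article


@dataclass
class Topic:
    name: str
    recent_articles: List[Article] = field(default_factory=list)


def should_trend(topic: Topic) -> bool:
    """Accept a topic per the rule reported by the Post:
    three or more recent articles from any source, or any
    single article with at least five related posts."""
    if len(topic.recent_articles) >= 3:
        return True
    return any(a.related_posts >= 5 for a in topic.recent_articles)
```

Note that nothing in this rule inspects the source or the content itself, which is exactly the verification step the fired human editors used to perform.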
In other words, no one's at the wheel, and the system is easy to game. That Facebook has so far avoided a full-blown catastrophe, with users falling en masse for a genuinely harmful fake news story, owes more to luck than anything else.
For what it's worth, Facebook VP of product management Adam Mosseri has said that Facebook plans to add filtering tech to the Trending algorithm that "guesses" whether content is fake or satirical, much as users' main feeds already do. But based on the Post's report, the technology clearly isn't there yet.
For more stories like this, follow @WhatsTrending on Twitter.