The Washington Post’s The Intersect ran a small experiment on Facebook’s Trending Topics, a feature available only in certain English-speaking countries. The experiment consisted of checking the Trending section each hour on the hour and recording which topics were trending on the platform. The results were published here.

The Intersect logged every news story that trended across four accounts on working days from 31st August to 22nd September. During that time, they uncovered five trending stories that were indisputably fake and three that were profoundly inaccurate. On top of that, blog posts from sites such as Medium and links to online stores such as iTunes also trended regularly.

Although these results shouldn’t be taken as conclusive, since Facebook personalises its trends to each user, the observation that Facebook periodically trends fake news still stands. Facebook’s Trending feature was supposed to serve as a snapshot of the day’s most important and most-discussed news, produced by a combination of algorithms and a team of editors: one algorithm flagged unusually popular topics, a human editor examined and vetted them, and a second algorithm surfaced the approved stories for the people most likely to be interested.
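As a rough illustration of that three-stage design, here is a minimal sketch in Python. Every name, data structure, and threshold below is hypothetical; Facebook has not published the details of its actual system.

```python
# A minimal sketch of the three-stage pipeline described above. All names,
# structures, and thresholds are hypothetical, not Facebook's actual system.

from dataclasses import dataclass, field


@dataclass
class Topic:
    name: str
    mentions_last_hour: int
    baseline_mentions: float          # typical hourly activity for this topic
    interested_users: set = field(default_factory=set)
    approved: bool = False


def detect_spikes(topics, spike_ratio=5.0):
    """Stage 1: flag topics whose current activity far exceeds their baseline."""
    return [t for t in topics
            if t.mentions_last_hour > spike_ratio * max(t.baseline_mentions, 1.0)]


def editorial_review(candidates, is_verified_news):
    """Stage 2: a human editor vets each candidate, e.g. by cross-checking
    the story against independent news sources."""
    for topic in candidates:
        topic.approved = is_verified_news(topic)
    return [t for t in candidates if t.approved]


def surface_for_user(approved_topics, user_id):
    """Stage 3: show each user only the approved topics relevant to them."""
    return [t.name for t in approved_topics if user_id in t.interested_users]
```

The point of the middle stage is that a human sits between detection and publication; it is exactly this stage that Facebook later removed.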

However, last May Facebook faced a torrent of high-profile accusations of political bias on the Trending editorial team. In the aftermath, the company decided to tweak the role humans play in approving Trending topics, and by August it had laid off its editorial team and instructed the engineers who replaced them to build more robust algorithms.

Under the earlier system, editors were told to independently verify trending topics surfaced by the algorithm, even by cross-referencing “Google News and other news sources”. But the engineers were told to accept every trending topic linked to three or more recent articles, from any source, or linked to any article with at least five related posts. A further problem is that many Facebook pages have a flock of fake followers sharing and liking their posts, which can make a story appear to be genuinely trending.
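Taken literally, that acceptance rule fits in a few lines of code, and writing it out makes the weakness visible: neither condition says anything about the credibility of the sources or the authenticity of the engagement. The function below is a hypothetical reading of the rule as described here, not Facebook’s actual code.

```python
from datetime import datetime, timedelta, timezone


def should_trend(articles, now=None, recent_window_hours=24):
    """Hypothetical reading of the acceptance rule: a topic trends if it is
    linked to three or more recent articles from any source, or to any single
    article with at least five related posts.

    `articles` is a list of dicts like:
        {"published": datetime (timezone-aware), "related_posts": int}
    """
    now = now or datetime.now(timezone.utc)
    window = timedelta(hours=recent_window_hours)

    # Condition 1: three or more recent articles, with no source check at all.
    recent = [a for a in articles if now - a["published"] <= window]
    if len(recent) >= 3:
        return True

    # Condition 2: any one article with five or more related posts. A handful
    # of fake accounts sharing and liking a post is enough to satisfy this.
    return any(a["related_posts"] >= 5 for a in articles)


# Example: a single hoax article boosted by six posts from bot accounts trends.
hoax = [{"published": datetime.now(timezone.utc), "related_posts": 6}]
print(should_trend(hoax))  # True
```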

Although the review guidelines appear largely to blame, Facebook hasn’t indicated any plans to change them. Rather, the social network maintains that its fake news problem can be solved by better, more robust algorithms. That may prove challenging: with so many brands and publications buying bots to inflate visibility and engagement, there is still no way of knowing how many real accounts Facebook actually has.

Read more about the experiment here.