YouTube has already employed a number of strategies to suppress content its owners have branded "conspiratorial," "hateful," or otherwise contrary to the company's Silicon Valley value system – including demonetizing videos and deplatforming controversial content creators like InfoWars. Yet criticism of YouTube as a breeding ground for these non-mainstream, Overton-window-expanding ideas has persisted. Now the company is trying something that could cut into its treasured view-hours metric: tweaking its recommendation algorithm to stop users from being ushered into conspiratorial "rabbit holes" on everything from Flat Earth theories to the 9/11 truther movement to purveyors of miracle cures.
According to the Daily Beast, the streaming video website is tweaking its recommendations algorithm to overlook content that “comes close” to violating – but doesn’t explicitly violate – YouTube’s “community guidelines.” The company estimates that this will impact less than 1% of all videos posted on the site.
To be clear, these videos will still appear on YouTube, and they can still be displayed in search results. The only thing that will change, according to the company, is their placement in the recommendations bar or queue.
This could be a huge blow to Flat Earthers and others who count YouTube as their biggest recruitment tool. But then again, that's the whole point: YouTube says the policy strikes an appropriate balance between free speech and the public interest.
At the second annual Flat Earth International Conference in November, most participants told The Daily Beast they'd converted to Flat Earth belief after watching YouTube videos on the topic. Some said they'd started out watching videos on conspiracies like 9/11 and eventually saw Flat Earth videos recommended in their YouTube feeds; others said they went looking for Flat Earth videos and were then recommended a stream of new ones.
Guillaume Chaslot, a former YouTube employee who worked on the site's recommendation algorithm in 2010, previously told The Daily Beast that the algorithm can push people down conspiratorial rabbit holes.
“I realized really fast that YouTube’s recommendation was putting people into filter bubbles,” Chaslot said last year. “There was no way out. If a person was into Flat Earth conspiracies,