
YouTube Is Finally Regulating the Conspiratorial Garbage Its Platform Is Famous For


YouTube is likely the biggest source of radicalization in the Western world, as its algorithm has a way of recommending conspiratorial content whether you want it or not. Here is MSNBC’s Chris Hayes walking you through a standard YouTube rabbit hole.

You watch videos on YouTube all the time, so you go home and put "Federal Reserve" into YouTube's search bar.

This is the first video that comes up (1.6 million views)— Chris Hayes (@chrislhayes) September 6, 2018

It's a John Birch Society lecture delivered by someone who says he "infiltrated Marxist organizations as a young man" and that Marxists infiltrate the media.— Chris Hayes (@chrislhayes) September 6, 2018

Once you've viewed that, you can then click on this video, conveniently furnished by the algorithm:

Trump Tells Everyone Exactly Who Created Illuminati (4 million views)— Chris Hayes (@chrislhayes) September 6, 2018

This is how YouTube makes money. Speaking as someone who tried to start his own social network centered around an algorithm, I can tell you it's clear as day what is (or was) happening on YouTube. Its most ardent users are conspiracy theorists, and the algorithm prioritizes videos that get a lot of engagement and keep people on the site as long as possible. As a result, the algorithm assumes that when someone watches a popular cooking tutorial, the next thing they want to watch is a 20-minute diatribe with 14 million views on how lizard people really run the world. This weekend, YouTube announced that it will finally do something to stem the constant waterfall of toxic sludge it pumps into the political discourse every day:

YouTube has announced that it will no longer recommend videos that “come close to” violating its community guidelines, such as conspiracy or medically inaccurate videos.

Former Google engineer Guillaume Chaslot told a very personal story about how YouTube's algorithm works (or used to work), and how its design ends up radicalizing people.

Brian is my best friend's in-law. After his dad died in a motorcycle accident, he became depressed. He fell down the rabbit hole of YouTube conspiracy theories, with flat earth, aliens & co. Now he does not trust anyone. He stopped working, seeing friends, and wanting kids. 2/— Guillaume Chaslot (@gchaslot) February 9, 2019

We designed YT's AI to increase the time people spend online, because it leads to more ads. The AI considers Brian as a model that should be reproduced. It takes note of every single video he watches & uses that signal to recommend it to more people 4/— Guillaume Chaslot (@gchaslot) February 9, 2019

Brian's hyper-engagement slowly biases YouTube:

1/ People who spend their lives on YT affect recommendations more
2/ So the content they watch gets more views
3/ Then youtubers notice and create more of it
4/ And people spend even more time on that content. And back at 1

6/— Guillaume Chaslot (@gchaslot) February 9, 2019
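The four-step loop Chaslot describes is, at bottom, a simple positive-feedback system. Here is a minimal toy sketch of it in Python; every name, number, and function below is invented for illustration and has nothing to do with YouTube's actual code:

```python
# Toy model of the engagement feedback loop described above.
# All values are invented; this is an illustration, not YouTube's system.

videos = {"cooking_tutorial": 1.0, "conspiracy_rant": 1.0}  # recommendation weights

def watch(video, hours, videos):
    # Watch time feeds directly back into the recommendation weight,
    # so hyper-engaged viewers bias the system (steps 1-2 of the loop).
    videos[video] += hours

def recommend(videos):
    # Recommend whichever video has accumulated the most engagement.
    return max(videos, key=videos.get)

# A casual viewer samples a little of everything; one obsessive viewer
# binges conspiracy content for hours every day.
for day in range(30):
    watch("cooking_tutorial", 0.5, videos)
    watch("conspiracy_rant", 6.0, videos)

print(recommend(videos))  # prints "conspiracy_rant"
```

After a month, the binge-watched video dominates recommendations, which (per steps 3-4) draws more creators and more watch time to that genre, closing the loop.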

Example of YT vicious circle: two years ago I found out that many conspiracies were promoted by the AI much more than truth, for instance flat earth videos were promoted ~10x more than round earth ones

8/— Guillaume Chaslot (@gchaslot) February 9, 2019

The AI change will have a huge impact because affected channels have billions of views, overwhelmingly coming from recommendations. For instance the channel secureteam10 made half a billion views with deceiving claims promoted by the AI, such as: 12/— Guillaume Chaslot (@gchaslot) February 9, 2019

This AI change will save thousands from falling into such rabbit holes

(If it decreases between 1B and 10B views on such content, and if we assume one person falling for it each 100,000 views, it will prevent 10,000 to 100,000 "falls") 14/— Guillaume Chaslot (@gchaslot) February 9, 2019
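Chaslot's parenthetical estimate is simple division, and the arithmetic checks out. A quick sanity check, using his stated assumptions (which are assumptions, not verified figures):

```python
# Sanity-checking Chaslot's back-of-the-envelope estimate: views no longer
# recommended, divided by views per "fall". His assumptions, not verified data.
views_per_fall = 100_000          # assumed: one person radicalized per 100,000 views

low_views = 1_000_000_000         # 1B views removed (low estimate)
high_views = 10_000_000_000       # 10B views removed (high estimate)

print(low_views // views_per_fall)   # prints 10000
print(high_views // views_per_fall)  # prints 100000
```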

Conclusion: YouTube's announcement is a great victory which will save thousands. It's only the beginning of a more humane technology. Technology that empowers all of us, instead of deceiving the most vulnerable.

If you see something, say something.

16/— Guillaume Chaslot (@gchaslot) February 9, 2019

This is a big, big deal. Around 1.3 billion people use YouTube. For years, its algorithm eschewed any standard of basic humanity in a cash grab underwritten by conspiracy theories that have done immeasurable damage to democracy in the West. YouTube's algorithm was so hapless at identifying manipulative BS that Kremlin propaganda made it through its filters, and YouTube didn't fix the problem until NBC alerted the company to its failure.

The absolutely insane QAnon conspiracy theory, which falsely asserts that Robert Mueller and Donald Trump are actually teaming up to take down a worldwide pedophilia ring run by the Democrats and Hollywood, would have nowhere near as much reach today without YouTube. When QAnon named Tom Hanks as complicit in its fantasy, YouTube's search results effectively verified it as true.

This is what happens when you search Tom Hanks on YouTube today.

Last week, Qanon folks decided he was a pedophile. If you were to search YouTube today, you'd believe it.— Ben Collins (@oneunderscore__) July 30, 2018

Same with Steven Spielberg.

Add Steven Spielberg to the group of celebrities whose search results on YouTube prioritize baseless pedophilia accusations first.

Three of the top five results are QAnon conspiracy theorists calling him a pedophile. Pandemic levels of bullshit unchecked on YouTube today.— Ben Collins (@oneunderscore__) July 30, 2018

What YouTube is doing here is admitting that it is an editorial company. Silicon Valley has fought this distinction tooth and nail in favor of a "we are not responsible for what people put on our platform" approach. The problem is that as your platform gets larger and larger, it becomes harder and harder to assert zero editorial guidance. The logical outcome of this mindset is YouTube, pre-algorithm change, where you can watch a tutorial on how to make ravioli and then have YouTube suggest a Jordan Peterson video with ten million views in which he argues that misogyny is necessary because of some minute trait in lobsters (you may think I'm being unnecessarily hyperbolic to lampoon the famed Canadian "philosopher," but nope, he really argued this should be the case).

The central problem in this whole mess is that capitalism has no moral center. Strictly adhering to the whims of the market has led YouTube to be dominated by conspiracy theorists, to such a degree that the recommendations coming out of the algorithm made it seem like conspiracy theories are YouTube's product. Cracking down on this nonsense is an unequivocally good thing, and YouTube deserves some amount of credit for finally doing the bare minimum, but the fundamental truth of its platform is unchanged. By placing its fate in the whims of the market, without any serious thought whatsoever about what kind of company it wants to be, YouTube has become the place on the web to host conspiracy theories, and it will take a lot more than an algorithm tweak to change that.

Jacob Weindling is a staff writer for Paste politics. Follow him on Twitter at @Jakeweindling.
