How YouTube is quietly radicalizing the next generation — and making millions

In August, as neo-Nazis rioted in Chemnitz, Germany, YouTube recommended videos from extremist sources that blamed the riots on refugees. YouTube was also castigated by researchers at the Counter Extremism Project and by MSNBC host Chris Hayes for broadcasting terrorist propaganda and conspiracy videos. Hayes recently posted a Twitter thread of horrific results YouTube returned for a search on the Federal Reserve. Elsewhere, journalists and researchers have documented women YouTubers promoting white supremacy and YouTube stunts pushing extremist politics in a media environment similar to right-wing talk radio.

Facebook’s Mark Zuckerberg and Sheryl Sandberg and Twitter CEO Jack Dorsey have been called to testify before Congress in high-profile hearings that questioned their products, politics and allegiances. And yet YouTube has remained conspicuously absent from these calls for platform accountability.

This is a mistake. YouTube is a principal online news source for young people. According to Pew Research Center, 73 percent of U.S. adults use YouTube, and 94 percent of 18- to 24-year-olds do. Another Pew study found that the platform was second only to Facebook as the most popular social network for viewing news stories. At the same time, YouTube — a subsidiary of Google — has a business model that helps amplify and propagate extremism through social networks.

Over the past year, I have researched what I and my Data & Society Research Institute colleague Joan Donovan term the Alternative Influence Network on YouTube. I found that YouTube provides a venue for an alternative media universe comprising a hybrid counterculture of entertainers, journalists and commentators who discuss similar themes, criticize mainstream news outlets and appear as guests on one another’s channels. Within this media system, many influencers, intentionally or unintentionally, destabilize the worldviews of young people by turning them against the mainstream media. Some influencers encourage their viewers to openly embrace racism, misogyny and white nationalism.

To be clear, YouTube’s glut of white-supremacist content isn’t simply a glitch in the platform’s content-delivery system: It’s the product of a social problem that technology has badly exacerbated and that extremists have exploited to amplify their messages as widely as possible. The company’s decision to ban Infowars founder Alex Jones’s channel only addresses the biggest animal in the herd. Multitudes of other extremist channels remain, including those of white nationalist and reactionary influencers. Many lesser-known channels broadcast to hundreds of thousands of followers on a platform that profits from pushing their message, unmoderated, unedited and unabated. YouTube relies on viewers to flag content that exceeds the platform’s community standards, and those who watch extremist influencers rarely do that.

Instead, YouTube audiences are at risk of radicalization by influential figures who gain trust through long-term, one-sided relationships with their audiences. Influencers broadcast the minutiae of their lives, react to their viewers in real time and talk to them through interactive chat features — on channels that can net thousands of dollars in a single broadcast. Like other YouTubers, members of this alternative influence network strive to make their content lighthearted and fun, with what they present as a sensible message of rebellion. And this sometimes leads new viewers gently toward the viciously racist, sexist and homophobic content that maximizes viewership and donations.

Zeynep Tufekci, an associate professor at the UNC School of Information and Library Science, has argued that YouTube’s recommendation algorithm makes it easy to expose viewers to incrementally more extremist content. We argue that the same holds true for content collaborations: Content creators work together to reach new audiences, exploiting a mix of search engine optimization and media hype. Comparatively moderate influencers appear alongside or in sequence with those espousing more openly extremist views, which the more mainstream figures in turn help normalize.

For example, members of the so-called “Intellectual Dark Web” have appeared on the YouTube talk show hosted by Stefan Molyneux, who claims black people are genetically inferior to whites (and he, in turn, has appeared on theirs). This influencer cross-pollination is a technique usually associated with commercial brands — but in this case, it’s selling extremism.

Extremism on YouTube is interwoven with the platform itself: The engine of radicalization particular to YouTube is its monetization of extreme ideas, which allows both extremists and the company to profit from such dangerous content. As such, the problem requires more profound and complex solutions than banning a few accounts or altering algorithms. But at minimum, YouTube itself should no longer slip under the radar of public scrutiny.