YouTube recommendations for 'alt-right' videos have dropped dramatically, study shows

YouTube logo seen on an Android phone. (SOPA Images)

Google has made “major changes” to its recommendations system on YouTube that have reduced the amount of “alt-right” videos recommended to users, according to a study led by Nicolas Suzor, an associate professor at Queensland University of Technology.

During the first two weeks of February, alt-right videos appeared in YouTube’s “Up Next” recommendations sidebar 7.8 percent of the time (roughly one in 13). From Feb. 15 onward, that number dropped to 0.4 percent (roughly one in 250).

Suzor’s study analyzed random samples drawn from 3.6 million videos, using 81 channels identified in a recent study by Rebecca Lewis as a starting point. That list includes figures like Richard Spencer, an American white supremacist, but also more mainstream voices like Joe Rogan, who does not self-identify as alt-right but often plays host to more extremist guests on his podcast (including alt-right figures such as Alex Jones).

The drop appears significant, but it’s difficult to determine precisely how it came about. We don’t know whether YouTube is targeting ‘alt-right’ videos specifically or whether the drop-off is part of broader changes to YouTube’s recommendation system.

YouTube has long spoken about making changes to its recommendations. As recently as two weeks ago, YouTube was criticized for allowing the flat-Earth movement to flourish on its platform.

In response, YouTube has attempted to curtail what it refers to as false information. YouTube says freedom of speech is one of its core tenets, even when people express controversial beliefs, but it has been working to reduce the spread of misinformation on its platform.

YouTube has also been investing in surfacing credible voices on its platform. “In the last year alone,” said one recent YouTube blog post, “we’ve made hundreds of changes to improve the quality of recommendations for users on YouTube.”

In that same blog post, YouTube said it was planning to reduce “recommendations of borderline content and content that could misinform users in harmful ways — such as videos promoting a phony miracle cure for a serious illness, claiming the Earth is flat, or making blatantly false claims about historic events like 9/11.”

YouTube stated that the change would be gradual and would apply to less than 1 percent of the content uploaded to YouTube.

In a statement sent to CNET, a YouTube spokesperson said the platform wasn’t targeting alt-right videos.

“We announced in January that we are reducing recommendations of borderline content or videos that could misinform users in harmful ways. We have not had a chance to thoroughly review this study, however, our recommendation systems are not designed to favor or demote specific misinformation based on specific political perspectives.”

Speaking to CNET, Suzor said he is looking for more transparency from YouTube about how and why certain videos are recommended.

“It’s not good enough that we have to guess about how well these systems are working,” he said, “and our research can only observe from the outside. YouTube has done a lot to improve transparency about its terms of service enforcement on a high level over the last year, but they still need to do more to help people understand how their algorithms are operating.”
