YouTube as a radicalizer

YouTube has been accused of ‘radicalizing’ users by gradually recommending more extreme videos (e.g., recommending ultramarathon videos after running videos, or far more politically extreme videos after political videos).

This idea was well captured by Zeynep Tufekci’s New York Times op-ed “YouTube, the Great Radicalizer” and by the Guardian article “How an ex-YouTube insider investigated its secret algorithm.”

Ribeiro2021auditing investigates this claim and finds some supporting evidence. Ledwich2019algorithmic argues the opposite: that YouTube is not a radicalizer and instead promotes mainstream media over extreme content. Haroon2022YouTube offers further evidence that YouTube funnels users toward partisan content (especially right-leaning users). The authors created many sockpuppet accounts, made them watch partisan videos, and examined what kinds of recommendations the puppets received. They found that the puppets are exposed to increasingly ideologically biased content as they follow the recommendation trail.
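The audit-style logic of such sockpuppet studies can be illustrated with a toy simulation. This is only a sketch of the general idea, not the actual methodology or data of any of the papers above: the recommender below is a made-up stand-in that skews its suggestions slightly toward (and beyond) the slant of the last-watched video, and each puppet greedily follows the recommendation trail while we record the slant of what it watches.

```python
import random

def recommend(last_slant, rng, n=5, drift=0.1):
    """Hypothetical recommender: suggests n videos whose ideological slant
    (in [-1, 1]) is centered slightly beyond the previously watched video."""
    return [min(1.0, max(-1.0, last_slant + drift + rng.gauss(0, 0.05)))
            for _ in range(n)]

def run_puppet(seed_slant, steps=50, seed=0):
    """Simulated sockpuppet: always 'watch' the most extreme recommendation
    and record the slant of each watched video along the trail."""
    rng = random.Random(seed)
    trail = [seed_slant]
    for _ in range(steps):
        recs = recommend(trail[-1], rng)
        trail.append(max(recs, key=abs))  # pick the most extreme suggestion
    return trail

trail = run_puppet(seed_slant=0.2)
print(f"start slant: {trail[0]:.2f}, end slant: {trail[-1]:.2f}")
```

Under these assumptions the watched-video slant ratchets toward the extreme, which is the qualitative pattern the audits measure; the real studies, of course, measure recommendations from the live platform rather than a simulated one.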

These studies examine what kinds of videos are recommended. However, more exposure does not necessarily lead to ideological shifts or stance changes. Liu2023algorithmic performs a naturalistic experiment by building a YouTube-like interface that recommends actual videos from YouTube. They report that even large perturbations of the recommendations have only limited causal effects on policy attitudes.