From my new piece in The Washington Post:
YouTube is overshadowed by Facebook and Twitter in the debate over the harms of social media, but the site has massive reach — 3 in 4 Americans report using it. That reach has been driven in part by YouTube's use of algorithms to recommend more videos to watch, a feature that critics warn can lead people down rabbit holes of conspiracy theories and racism.
In 2018, for example, the sociologist Zeynep Tufekci described how YouTube began suggesting she check out "white supremacist rants, Holocaust denials and other disturbing content" after she watched videos of Donald Trump rallies in 2016, prompting her to warn that the site was "potentially helping to radicalize billions of people."
Google — YouTube's parent company — has sought to address these concerns. In 2019, for instance, it announced new efforts to remove objectionable content and reduce recommendations to "borderline" content that raises concerns without violating site policies.
Has YouTube done enough to curb harmful material on the platform? In a new report published by the Anti-Defamation League, my co-authors and I find that exposure to potentially harmful content remains alarmingly common. When we directly measured the browsing habits of a diverse national sample of 915 participants, we found that more than 9 percent of participants viewed at least one YouTube video from a channel that has been identified as extremist or white supremacist; meanwhile, 22 percent viewed one or more videos from "alternative" channels, defined as non-extremist channels that can serve as gateways to fringe ideas.