Researchers posing as 9-year-olds were flooded with gun-related content.
YouTube’s recommendations are leading young kids to videos about school shootings and other gun-related content, according to a new report. The Tech Transparency Project (TTP), a nonprofit watchdog group, says YouTube’s recommendation algorithm is “pushing boys interested in video games to scenes of school shootings, instructions on how to use and modify weapons” and other gun-centric content.
The researchers behind the report set up four new YouTube accounts posing as two 9-year-old boys and two 14-year-old boys. All accounts watched playlists of content about popular video games, like Roblox, Lego Star Wars, Halo and Grand Theft Auto. The researchers then tracked the accounts’ recommendations during a 30-day period last November.
“The study found that YouTube pushed content on shootings and weapons to all of the gamer accounts, but at a much higher volume to the users who clicked on the YouTube-recommended videos,” the TTP writes. “These videos included scenes depicting school shootings and other mass shooting events; graphic demonstrations of how much damage guns can inflict on a human body; and how-to guides for converting a handgun to a fully automatic weapon.”
As the report notes, several of the recommended videos appeared to violate YouTube’s own policies. Recommendations included videos of a young girl firing a gun, as well as tutorials on converting handguns into “fully automatic” weapons and making other modifications. Some of these videos were also monetized with ads.
In a statement, a YouTube spokesperson pointed to the YouTube Kids app and its in-app supervision tools, which “create a safer experience for tweens and teens” on its platform.
“We welcome research on our recommendations, and we’re exploring more ways to bring in academic researchers to study our systems,” the spokesperson said. “But in reviewing this report’s methodology, it’s difficult for us to draw strong conclusions. For example, the study doesn’t provide context of how many overall videos were recommended to the test accounts, and also doesn’t give insight into how the test accounts were set up, including whether YouTube’s Supervised Experiences tools were applied.”
The TTP report is far from the first to raise questions about YouTube’s recommendation algorithm. The company has spent years working to keep so-called “borderline” content (videos that don’t break its rules outright but may otherwise be unsuitable for mass distribution) out of recommendations. And last year, the company said it was considering disabling sharing altogether on some such content.