YouTube’s recommendations lead young children to videos about school shootings and other gun-related content, according to a new report. The Tech Transparency Project (TTP), a nonprofit monitoring group, found that YouTube’s recommendation algorithm “nudges boys interested in video games to scenes of school shootings, instructions on how to use and modify weapons” and other gun-focused content.
The researchers behind the report created four new YouTube accounts, posing as two 9-year-old boys and two 14-year-old boys. All of the accounts watched playlists of content about popular video games, such as Roblox, Lego Star Wars, Halo and Grand Theft Auto. The researchers then tracked the accounts’ recommendations over a 30-day period last November.
“The study found that YouTube pushed shooter and weapon content to all gamer accounts, but at a much higher volume for users who clicked on recommended videos on YouTube,” TTP wrote. These videos included scenes depicting school shootings and other mass shootings; graphic displays showing how much damage guns can do to the human body; and instructional guides for converting a handgun into a fully automatic weapon.
As the report notes, many of the recommended videos appear to violate YouTube’s own policies. The recommendations included videos of a girl shooting a gun, tutorials on converting handguns into “fully automatic” weapons, and guides to other modifications. Some of these videos were also monetized with ads.
In a statement, a YouTube spokesperson pointed to the YouTube Kids app and the platform’s in-app tools, which the company says “create a safer experience for tweens and teens.”
“We welcome research around our recommendations, and we’re exploring more ways to bring in academic researchers to study our systems,” the company spokesperson said. “But in reviewing this report’s methodology, it’s difficult for us to draw strong conclusions. For example, the study doesn’t provide context on the total number of videos recommended to the test accounts, nor does it give insight into how the test accounts were set up, including whether YouTube’s supervised experiences tools were applied.”
The TTP report is far from the first time researchers have raised questions about YouTube’s recommendation algorithm. The company has also spent years working to keep so-called “borderline” content (videos that don’t strictly break its rules but might be inappropriate for mass distribution) from appearing in recommendations. And last year, the company said it was considering disabling sharing of some of that content altogether.