The company banned more than 400 accounts and took down dozens of videos that put children at risk.
But even though YouTube addressed this particular controversy, critics say they are fed up that child-safety problems keep arising in the first place. For example, two years ago, YouTube faced a backlash after disturbing videos got past filters on YouTube Kids, a version of the service designed for children.
“This has been happening for years,” Haley Halverson, vice president of advocacy and outreach at the National Center on Sexual Exploitation, said in an email. “Why isn’t it YouTube’s No. 1 priority to create sustained solutions, instead of carrying on with its current whack-a-mole approach?”
The latest incident began on Sunday, when a video blogger named Matt Watson detailed how pedophiles could enter a “wormhole” of YouTube videos to see footage of children in sexually suggestive positions. In the comments of those videos, users would post time stamps linking to other videos, and YouTube’s algorithms would recommend even more of those kinds of videos.
In response, advertisers including AT&T and Epic Games, maker of Fortnite, pulled ad spending from YouTube.
YouTube declined to make an executive available for an interview, but in a statement a spokeswoman said: “Any content — including comments — that endangers minors is abhorrent and we have clear policies prohibiting this on YouTube … There’s more to be done, and we continue to work to improve and catch abuse more quickly.”
But child advocacy groups say the company isn’t working fast enough.