In response to its recent child exploitation controversy, YouTube says it is “aggressively approaching” a solution by restricting ads, increasing comment moderation, and addressing problems with its recommendation and search systems.
As part of that effort, YouTube is limiting which videos some ads can run on and is applying those restrictions to millions of videos, according to AdWeek, which cited a memo sent to advertisers. In the memo, YouTube describes how it is trying to protect children after major companies including Disney, Epic Games, AT&T, and Nestlé pulled their ads from the platform. The pullout came in response to an ongoing campaign drawing attention to videos being used by predators to exploit children.
The campaign, which is led by creator Matt Watson, points specifically to the fact that advertisers are running their commercials on these videos.
YouTube also says it may ask creators to moderate their comments more rigorously. In the memo, YouTube admits that it is responsible for the comments appearing on its site and states that it will “hold monetizing channel owners to a higher standard.”
A YouTube spokesperson said the platform has taken continuous, aggressive steps to fight this type of content, including hiring social workers, child development specialists, former prosecutors, and former FBI and CIA employees. The spokesperson added that YouTube removes accounts run by children under the age of 13 every week.
One key change YouTube worked on as the campaign took effect was addressing its recommendation algorithm. Watson showed that search terms like ‘bikini haul’ led the algorithm to suggest videos containing predatory content within five clicks. According to a spokesperson, YouTube has recognized that autocomplete suggestions may have increased the likelihood that someone would come across that content. YouTube’s recommendation algorithm has long been a problem for the platform, including for surfacing conspiracy theories and hateful content.
“Once they were made aware of the offending content, they handled the situation,” commentator Philip DeFranco said. “The best thing we can do is report disgusting monsters as we would anywhere else on the internet.”
Many YouTube creators are concerned that this might trigger a new “adpocalypse,” a term used to describe periods when YouTube heavily restricts ads. “I’m not reporting the story because it negatively affects the whole YouTube community,” Daniel “Keemstar” Keem, the host of the popular show DramaAlert, tweeted earlier this week. “We don’t need another ad apocalypse.”
The latest concern started with a Reddit post on r/drama and a YouTube video exposing a “wormhole into a soft-core pedophilia ring on YouTube,” according to Watson. Watson, a former YouTube creator who returned with a single livestream video about the topic, showed how a search term like ‘bikini haul’ can lead to exploitative videos of children. The videos are not pornographic in nature, but the comment sections are full of people timestamping moments where the children featured in the videos are in compromising positions.
“Youtube’s recommended algorithm is facilitating pedophiles’ ability to connect with each other, trade contact info, and link to actual child pornography in the comments,” Watson wrote on Reddit. “I can consistently get access to it from vanilla, never-before-used YouTube accounts via innocuous videos in less than ten minutes, in sometimes less than five clicks.”
In a statement, a YouTube spokesperson said the company has taken down several accounts that were featured in Watson’s video.
“Any content — including comments — that endangers minors is abhorrent and we have clear policies prohibiting this on YouTube,” the spokesperson said. “We enforce these policies aggressively, reporting it to the relevant authorities, removing it from our platform and terminating accounts. We continue to invest heavily in technology, teams, and partnerships with charities to tackle this issue.”