It’s time to stop child exploitation on YouTube

Gabriella Pinos/ Assistant Entertainment Director 

Even on the world’s largest video streaming website, kids are unsafe.

On one hand, YouTube has given children a platform to express themselves and share their creativity with the world, whether it be through vlogs, skits, talk shows or just by being themselves.

But that freedom of expression is being exploited by sick, perverted individuals who manipulate the website for their own sinister purposes. The worst part: YouTube has encouraged them to do so.

On Sunday, Feb. 17, a viral video by YouTuber Matt Watson showcased how the platform’s algorithm allows users – even those with new accounts – to encounter inappropriate comments under videos of young children. While some of these videos were uploaded by the kids themselves or their parents, others were reuploaded to the site by pedophiles, according to Watson.

The comment sections for these videos contain some of the most repulsive content on the site – sexual and predatory language, timestamps that mark when these children are in compromising positions and even links to similar videos or WhatsApp group chats where these pedophiles interact with each other.

YouTube’s algorithm seems to recommend these kinds of videos once the user has watched as few as two of them, hurling them into a “wormhole” of “soft-core pedophilia.”

It’s sickening, to say the least, that the system YouTube was built on can unintentionally encourage predatory behavior towards minors. What’s worse is that it’s not new to the platform.

In 2017, videos featuring adults dressed as cartoon characters such as Elsa and Spider-Man stirred controversy, as they often contained sexual or inappropriate scenarios. The controversy, called Elsagate, entered the mainstream media as news outlets reported on the pedophilic behavior within YouTube communities.

On Monday, Feb. 25, 2019, a Florida mom discovered clips inserted into children’s videos giving instructions on how to commit suicide and self-harm.

The clips in question originated from a YouTuber called Filthy Frank, whose channel was notorious for edgy, ironic humor targeted at adults. Watching some of his past videos, it’s clear that this content is not meant to be taken seriously and certainly not meant to be viewed by children.

But through the eyes of a child, this content isn’t humorous – it’s encouraging self-harm. What’s ridiculous is that these clips somehow managed to pass through the filters on YouTube Kids, which is meant to be a kid-friendly version of the platform. 

These events, especially Watson’s video, have pushed companies such as AT&T and Disney to pull advertisements from YouTube altogether. The move, some creators fear, will cause another “adpocalypse” on the platform, which will prevent many channels from receiving monetization or advertisement revenue.

While it’s great that advertisers are finally removing their ads from these videos, it’s only putting a Band-Aid over the gaping wound that is YouTube’s faulty algorithm. 

Pedophilic content has thrived on YouTube thanks to its recommendation system; it’s terrifying to think how long this behavior has gone on and how many children have been scarred or victimized by predatory and harmful content.

It’s understandable that with the number of videos uploaded daily to the site, some inappropriate content will fall through the cracks. But it all raises questions about how serious YouTube is about regulating its website.

At least now – and only just now – is this issue being taken seriously.

After the Elsagate fiasco in 2017, the platform published tougher rules for family-friendly videos to remove these videos from the site. Since the Watson video went viral, YouTube has disabled comments on videos with minors, removed inappropriate comments and deleted channels that exhibited predatory behavior, according to a blog post published by YouTube on Thursday, Feb. 28.

Yes, some of this feels a little too late, but YouTube seems to be listening to creators and viewers this time around with the new steps it is integrating into the site – something it should have been doing more seriously from the start. The harm has already been done; the only way to prevent more of it is to continue pushing for changes in the platform’s regulations and to report these inappropriate videos if or when they are recommended.

In the words of Filthy Frank, “it’s time to stop.”

DISCLAIMER:

The opinions presented within this page do not represent the views of PantherNOW Editorial Board. These views are separate from editorials and reflect individual perspectives of contributing writers and/or members of the University community.

Photo by Christopher Gower on Unsplash
