YouTube has a pedophilia problem, and its advertisers are jumping ship

The video platform has major child safety issues. Will big-name advertisers force it to solve them?

For years, health professionals and childhood advocacy groups have been vocal about their concerns over child safety and YouTube. The company has taken measures to try to make YouTube a safe space for children and shield its young viewers from the dangers of the internet. Four years ago, for example, it launched a special app specifically for children’s content, YouTube Kids.

But despite these efforts, the problems have not gone away.

Last week, Wired reported that predators have been congregating in the comments sections of YouTube videos of children, leaving sexually suggestive remarks and sharing timestamps. In a statement to Vox, YouTube says it “took immediate action by deleting accounts and channels” and that it will “continue to work to improve and catch abuse more quickly.”

Just days before the Wired story was published, a YouTube vlogger named Matt Watson said an investigation of his own had found that YouTube’s algorithm keeps serving up videos of children playing once viewers start seeking them out, creating what he calls a “wormhole into a soft-core pedophilia ring.”

These are troubling findings — for parents who upload videos to YouTube of their kids playing, and for children who are increasingly abandoning television to hang out on YouTube instead. In some cases, children are even interacting with the commenters, according to Wired, responding to their questions and providing personal information like their age.

It also spells big trouble for advertisers. In the past week, brands like AT&T, Disney, Hasbro, Epic Games, and Nestle have pulled their ads from YouTube, saying they won’t work with the tech giant until it can figure out how to solve this problem.

Google has claimed it’s fixed the issue by banning some accounts and closing the comments sections on certain videos. But Haley Halverson, of the DC-based National Center on Sexual Exploitation, says it’s still very much ongoing.

“Within two clicks, I was able to enter into a rabbit hole of videos where children are being eroticized by pedophiles and child abusers,” Halverson wrote in a statement released Friday. “The content became more flagrantly sexualized the more I clicked, as the YouTube recommendation algorithm fed me more and more videos with hundreds of thousands, and sometimes millions, of views. Despite YouTube’s claims to be cleaning up this content, YouTube so far still continues to monetize videos that eroticize young children and that serve as hubs for pedophiles to network and trade information and links to more graphic child pornography.”

YouTube is now figuring out how to deal with the problem of sexually exploitative comments, as advertisers abandon the deals that help Google earn its billions. But as with most issues involving tech giants, the solution isn’t simple.

How YouTube became an inadvertent home to a “soft” pedophile ring

YouTube was launched in 2005 by three former PayPal employees. The idea for an easily accessible video site came about after one of the founders had trouble finding a clip of Janet Jackson’s wardrobe malfunction from the 2004 Super Bowl.

Google bought the site in 2006 for $1.65 billion, and today it is the second-most-visited website on the internet (behind Google.com). YouTube has 1.9 billion monthly users, about a third of all internet users.

YouTube today has turned into, among other things, a platform where personalities can make millions of dollars off weird DIY videos and beauty tutorials. It’s also home to plenty of funny videos, many of which involve children. One of the first viral hits was “Charlie bit my finger”; the clip of 3-year-old Harry having his finger chomped on by his 1-year-old brother has more than 867 million views. There was the 2009 video of then-7-year-old David, loopy on nitrous after a trip to the dentist, and the video of an adorable then-3-year-old girl named Cody, who cried because she couldn’t handle how much she loved Justin Bieber.

Beyond these viral hits, there are hundreds of millions of hours’ worth of content about kids on YouTube, most of it innocent. But child advocacy groups have warned for years about the dangers YouTube, and the internet in general, poses to children, and those concerns are now bubbling to the surface as it becomes clear that these videos are not always being viewed with innocent intentions.

YouTube has clear rules against explicit content. In its outline of its nudity and sexual content policy, the company writes that “content meant to be sexually gratifying (like pornography) is not allowed on YouTube” and that “videos containing fetish content will be removed or age-restricted.” The company also says that “sexually explicit content featuring minors and content that sexually exploits minors” is not allowed. It says it reports content that contains imagery of child sexual abuse to the National Center for Missing and Exploited Children, which works with law enforcement.

But as the Wired story points out, these people aren’t necessarily coming to YouTube in search of pornographic content. They’re seeking out more innocuous material, like videos of children exercising or playing games in which the kids’ private parts, covered or uncovered, are visible. Commenters leave suggestive remarks on these videos (which are mostly of girls, some as young as 5) and share timestamps marking the moments where such content appears. Per YouTube vlogger Watson’s assessment, the comments section is also enabling these people to communicate with one another.

Child pornography “is being traded, as well as social media and WhatsApp addresses. YouTube is facilitating this problem,” Watson writes in the description of his video, explaining how these YouTube commenters are getting in touch with each other to share sexual content about children, making their network broader and more sprawling than just YouTube.

Because of YouTube’s recommendation algorithm, once viewers start watching videos of children playing and jumping, they are served more popular videos along the same lines; in effect, the video-sharing site delivers exactly the content they are looking for. In some cases, children on YouTube are even responding to commenters. Per Wired, “on one video, a young girl appears to ask another commenter why one of the videos had made him ‘grow’.”

YouTube has said that it’s “aggressively” tackling this problem. In an email to Vox, a spokesperson wrote:

Any content — including comments — that endangers minors is abhorrent and we have clear policies prohibiting this on YouTube. We took immediate action by deleting accounts and channels, reporting illegal activity to authorities and disabling comments on tens of millions of videos that include minors. There’s more to be done, and we continue to work to improve and catch abuse more quickly.

The company also says it’s purged 400 accounts responsible for uploading videos that appear to be exploiting children, and deleted millions of comments. YouTube told Vox it constantly tries to kick users younger than 13 off the platform, and that it’s been trying to hire more child safety experts, including former CIA and FBI employees.

Yet YouTube is still littered with hundreds of thousands of videos of children, and many of these still have problematic comments.

I’ve spent a few hours this week doing my own searches on the site. I found that many videos of children playing have had their comments sections disabled; I’ve also seen some videos taken down.

But there are still plenty of videos with innocent content that’s being exploited, like clips of girls playing in skirts. These videos still carry comments with timestamps, and commenters are still exchanging personal information.

Another “adpocalypse” as advertisers back away from YouTube

The problem becomes even more morally disturbing when you consider that money is being made from this content. Ads from dozens of brands run against these videos, and YouTube brings Google about $3.9 billion in advertising revenue every year, per Statista.

“The pedophile crisis, like all YouTube crises, is a direct result of the platform’s business model,” Josh Golin, the executive director of the advocacy group Campaign for a Commercial-Free Childhood, says. “It is extremely important to note that YouTube’s algorithm was not malfunctioning; by recommending more and more videos to pedophiles of girls in swimsuits or doing gymnastics, it was functioning exactly as intended: to keep users on the site as long as possible so YouTube could make more money.”

Nearly every big company you can name, from HBO to Peloton to L’Oréal to Samsung, has advertised on the video-sharing site. After these reports surfaced last week, many began pulling their advertisements from YouTube. A spokesperson for Epic Games, which owns Fortnite, told Wired that it’s paused advertising on YouTube and had “reached out to YouTube to determine actions they’ll take to eliminate this type of content from their service.” Disney, Hasbro, and Nestle have also pulled their ads from YouTube, as has AT&T.

This is not the first time YouTube has gotten into trouble with advertisers. In 2017, brands like Verizon, Johnson & Johnson, and AT&T pulled their advertisements from YouTube and Google after it was revealed that their ads were playing alongside extremist content that promoted terrorism.

Some advocates are calling for more assertive action on YouTube’s part, like cracking down on all types of children’s content.

“Why isn’t YouTube taking more serious steps, like temporarily shutting down all comments and recommendations; removing children’s content from YouTube; and using Google’s enormous reach to tell parents to keep children — and videos of children — off YouTube?” asks Golin. “Obviously, these would be drastic steps, but if having pedophiles openly trade information on your site doesn’t lead to drastic action, what will?” (YouTube would not comment on why it has waited this long to start addressing the problem of sexually suggestive comments on videos of children.)

But flagging content based just on the fact that it contains children is likely not a viable solution, mainly because there are seemingly endless videos of children playing with toys, reviewing games, and so on. And some of these toy and gaming influencers bring in huge paychecks — for themselves and for YouTube. Other creators say they’ve been trying to take on this comment moderation issue on their own, and don’t want to be punished.

“I’m not reporting the story because it negatively affects the whole YouTube community,” Daniel Keem, host of the YouTube show DramaAlert, tweeted. His show covers all the drama going on in the world of social media, and he fired back after a follower asked why he hadn’t brought up YouTube’s issues with sexually suggestive comments on children’s videos. “We don’t need another ad apocalypse. What I have done behind the scenes, though, is reached out to my YouTube contacts showing them the video and my team is showing them content to take down. This is not just about me. This is about all my friends, big and small creators. I’m not reporting something that’s going to affect their livelihoods.”

YouTube’s short-term fix, so far, has been to limit ads on videos featuring children. On Twitter, it said that “even if your video is suitable for advertisers, inappropriate comments could result in your video receiving limited or no ads.”

This, of course, has rattled the community of YouTube stars, who are concerned their content won’t make money now. Content creators can file an appeal if their videos get flagged and the ads are removed. As one mom YouTuber tweeted: “MY 5 YEAR OLD SON: does gymnastics and is a happy, sweet, confident boy. youtube: NOT ADVERTISER FRIENDLY.”

Meanwhile, it’s clear that even if YouTube solves its children’s content problem, cleaning up other concerning material on the site is going to be difficult. Just this past week, Free Hess, a pediatrician and blogger from Gainesville, Florida, said she’s found content promoting suicide on YouTube, and that even though she’s flagged the videos, they keep reappearing.

Posting this type of violent content is against YouTube’s rules, but some of it doesn’t show up in search. Instead, Hess found, the clips are spliced into the middle of children’s videos. In one, a man jumps in to say, “Remember, kids: Sideways for attention. Longways for results,” as he pretends to slice his arm.

“I think it’s extremely dangerous for our kids,” Hess told the Washington Post about YouTube. “I think our kids are facing a whole new world with social media and Internet access. It’s changing the way they’re growing, and it’s changing the way they’re developing. I think videos like this put them at risk.”

Searches for Peppa Pig and Doc McStuffins lead to violent knockoff videos of the franchises that are inappropriate for children. Most recently, internet trolls have been capitalizing on the ongoing scare over the “Momo challenge,” in which a creepy, bug-eyed character (actually a Japanese sculpture) supposedly tells children to harm or kill themselves. The claim that kids are taking their lives because of Momo is a viral hoax, but parents say the character is now genuinely showing up in kids’ videos on YouTube, spliced into content to scare them (YouTube denies the challenge is being promoted in videos on its site).

Cleaning up content on YouTube won’t be an easy fix, and it won’t happen overnight. But it’s a pressing matter for both parents and children, and if their concerns won’t get the tech giant to act, then the prospect of Google losing advertising revenue to an exodus of brands might.


Author: Chavie Lieber
