YouTube has continued its efforts to "clean up" the platform's sometimes-questionable content. For years, YouTube has been a hit among parents as easy entertainment for their children. Educational content creators have built a network of free-to-access videos targeted specifically at babies, toddlers, and young children. Marketers, of course, recognize YouTube's reach and buy ad placement on videos designed to reach their core audience. By allowing content creators to make money from their uploads, YouTube encourages more - and higher quality - content. But the choice to monetize some channels has proven problematic. According to YouTube, it does not monetize videos with "dangerous and harmful" content.
"We have strict policies that govern what videos we allow ads to appear on... We enforce these policies vigorously, and if we find a video that violates them, we immediately take action and remove ads," a YouTube spokesperson said in an email statement to BuzzFeed News.
Last week, YouTube demonetized another problematic category of videos: those that promote anti-vaccination content. YouTube representatives have stated that anti-vax content violates the platform's "dangerous and harmful" content policies.
The video platform took its stance against anti-vax content a step further by adding a new fact panel to any video that promotes anti-vaccination rhetoric. The panels now appear at the bottom of these videos with a link to a definition of "vaccine hesitancy." They also include updates from the World Health Organization, which has called vaccine hesitancy one of the current top ten global health threats.
One anti-vax YouTuber, Larry Cook, reported that his entire channel had been demonetized. He also claims that YouTube did not contact him about the change.
Advertisers have reason to be wary. Only a week ago, kid-centric brand giants including Nestle, Hasbro, Kellogg, and Disney pulled ad money from YouTube. The family-friendly companies had discovered their ads running on questionable video content, including videos that were being commented on by a ring of pedophiles. According to the National Center on Sexual Exploitation, flagrantly sexualized videos are easy to find on YouTube. As with the anti-vax videos, YouTube claims its advertising algorithms are designed to prevent this from happening.
Some companies weren't even aware their ads had appeared on anti-vax channels. Nomad Health, a health tech company, said it "does not support the anti-vaccination movement" and claims it was "not aware of our ads running alongside anti-vaccination videos." The company also said it would "take action" to avoid this kind of situation in the future. Does that mean it plans to pull its content and advertising from the platform? Or will it refine its filters to avoid placement on troublesome videos? YouTube, for its part, says it is refining its own technology to better flag anti-vaccine content, pornography, and other questionable videos.
In the third quarter of last year alone, YouTube removed 58 million videos and 224 million comments. The video platform is rushing to tighten its content guidelines as advertisers ditch it over serious concerns. Since YouTube is a go-to for many parents, it's important to know whether the platform is advertising to our kids. If so, what ads are they seeing? Can YouTube successfully screen out content that isn't age-appropriate for little ones?
Over 300 hours of video are uploaded to YouTube every minute. With that kind of volume, it's easy to see how a few tricky content creators can slip past the filters. Still, it leaves both parents and advertisers wary of YouTube in general. Ultimately, marketers have been left unsure about the platform's "brand safety." This isn't just about maintaining a brand reputation, though. More importantly, the effort to clean up YouTube will protect our children from harmful anti-vaccine misinformation and other exploitative content.