YouTube used to be an unfiltered, lawless wasteland filled with edgy humour, problematic and at times life-threatening content, and the now-infamous YouTube Poop videos. The lucrative, ad-filled days of the video-sharing platform’s content renaissance are long gone, however, after years of battling countless controversies and facing government rulings on user and child safety.
Issues with algorithmic indoctrination aside, today YouTube aspires to be a progressive and friendly space for all, as well as a home for short-form video content. However, the platform may have swung too far in the other direction: it is now having to loosen changes it recently made to its monetisation policy after outcry from top creators.
The previous monetisation policy, which took effect in November 2022, penalised creators for things like explicit imagery and language, and sexually gratifying content. Before that update, YouTubers simply had to avoid swearing within the first 30 seconds of a video and tone down anything deemed too shocking. Under the new policy, however, strong language could get a video outright blacklisted for ads. The result was channels bleeping every curse word for fear of falling foul of YouTube’s temperamental safety and guidelines system.
Naturally, creators struggled to keep up with the constant changes to YouTube’s ad policy, which saw their videos demonetised on a whim. Gaming channels were hit particularly hard by the intrusive policy, with simulated in-game violence making videos ineligible for ads, even though such violence is an inherent part of some gaming content and is not always graphic or gratuitous.
The gaming content creator behind the popular channel RTGame shared his experience with the new policy in a video from 7 January 2023. In it, he explained that the service was actively age-restricting and limiting his content because older videos (content that had been generating revenue on the platform for years) were no longer in line with the policy, due to things like moderate profanity in the first 30 seconds and issues with simulated in-game violence.
Elden Ring YouTuber Ymfah, who has been facing similar problems getting his content approved, told SCREENSHOT that often “you aren’t even notified when a video gets demonetised,” making the process a lengthy and tiresome battle. In short, with the introduction of this controversial policy, YouTube was actively making life even more difficult for those who rely on earnings from the video-sharing platform.
In hopes of rectifying the mess, YouTube’s March ad policy update has addressed the complaints, aiming to open up more content for monetisation by loosening its strict guidelines and adding more words to its ‘moderate profanity’ category.
Words like ‘asshole’, ‘douchebag’, and other ‘moderate profanity’ can still be monetised. Essentially, the video hub is no longer going to treat all cursing equally, finally admitting that there’s a reasonable difference between the range and intent of the language used.
Of course, stronger language used early in a video will still make the content unavailable for advertising, as will stronger curse words repeated throughout the video. Harder words like ‘fuck’ remain dangerous ground to tread: you’d be safer bleeping them out, or avoiding them outright.
The recent changes also clarify which video game violence is and isn’t allowed. The policy update states that violence against a “real, named person” is against the guidelines and will make the video ineligible for monetisation.
Content creators have long battled YouTube for their right to freedom of expression, making this rollback a step in the right direction for people trying to make a living behind the camera. Hopefully, potty mouths and gamers alike can enjoy a bit more flexibility and protection from the platform.