As Facebook’s failures and wilful negligence in protecting its users from explicit content continue to surface, the Office of Communications (Ofcom) is stepping up to enforce stricter rules for certain platforms. In the UK, apps like TikTok and Twitch could soon face large fines and heavy restrictions if they fail to adhere to the government-approved regulatory body’s new rules. US Representative Alexandria Ocasio-Cortez (commonly referred to as AOC) argues that a major obstacle to monitoring Facebook is its monopoly over its own platforms and those it has acquired, and that it should be subject to independent oversight just as TikTok and Twitch now are.
The politician wrote on Twitter, “If Facebook’s monopolistic behaviour was checked back when it should’ve been (perhaps around the time it started acquiring competitors like Instagram), the continents of people who depend on WhatsApp and IG for either communication or commerce would be fine right now. Break them up.” She further stated that such a monopoly could have potentially dangerous implications for democracy as we know it.
Ofcom is the UK government-approved regulatory body that monitors and oversees the country’s telecommunications, broadcasting (both television and radio) and even postal sectors. The organisation’s responsibilities span a multitude of areas including complaints, codes and policies, licensing, competition, research and protecting the radio spectrum from abuse. Now, it has released new rules for video-sharing platforms (VSPs) operating in the UK.
In a first for Europe, VSPs like Snapchat, Twitch, Vimeo, OnlyFans (which has had its fair share of content moderation controversies) and of course, TikTok are now subject to million-pound fines—or in more serious cases, site-wide suspensions—by Ofcom if they fail to clamp down on hate speech, child sexual abuse material and inappropriate content on their platforms. While the broadcasting regulator’s main priority is child abuse material, other rules include a crackdown on terrorism-related content and racism.
Ofcom found that a third of VSP users have come across such content on the sites listed above. So, in order to protect those users, the organisation—although itself unable to assess individual pieces of content—will closely monitor each platform’s ability to act swiftly and effectively in removing content that violates guidelines. The VSPs will be required to prepare and adequately enforce clear guidelines for uploading content, develop an easy-to-use reporting and complaints process, and impose rigorous age-verification restrictions on certain content.
It seems some users have had enough of witnessing such content online, as the whole of Twitch has now been leaked. VGC has reported that an anonymous user posted a 125 GB torrent link to 4chan believed to contain the platform’s complete source code history, reports of creator payouts, Twitch clients and more; the leak (which is publicly available) was reportedly conducted to “foster more disruption and competition in the online video streaming space” because “their community is a disgusting toxic cesspool.” This is not the first time Twitch has come under fire from its users for its failure to moderate abuse: many boycotted the site over ‘hate raids’ targeted at black and LGBTQ streamers.
If the new rules set out by the UK watchdog are violated or not rigorously maintained, Ofcom will be allowed to exact a penalty of up to 5 per cent of a platform’s turnover, or £250,000. Such rules and punishments will only be applicable to VSPs that have a UK regional headquarters—meaning platforms like Netflix and YouTube are exempt from Ofcom’s oversight and must instead rely on regulators in the regions where they are headquartered. YouTube, for example, would have to answer to Irish authorities.
Ofcom’s chief executive, Dame Melanie Dawes, stated that “online videos play a huge role in our lives now, particularly for children… But many people see hateful, violent or inappropriate material while using them.” She continued, “The platforms where these videos are shared now have a legal duty to take steps to protect their users. So we’re stepping up our oversight of these tech companies, while also gearing up for the task of tackling a much wider range of online harms in the future.”