If you’ve noticed less violent content when scrolling social media, or that pornography websites are now asking users to verify their age with a government ID, it’s because of a new UK law aiming to make the internet a safer space for young people.
The government implemented the Online Safety Act on 25 July 2025, and its main aim is to stop young people from accessing ‘Primary Priority Content’—namely pornography, but also content that encourages eating disorders, self-harm, and suicide.
Social media platforms like Reddit and Bluesky have started to request ID to prove you’re over 18, or have one of your selfies algorithmically verified as belonging to an adult.
Although it may feel sudden, the recent change is actually an Ofcom-mandated implementation of the act, which was passed in 2023. The set of laws aims to “protect children and adults online” and puts a range of new duties on social media companies and search services, “making them more responsible for their users’ safety on their platforms.”
While the act aims to protect those under 18 from illegal, harmful, and age-inappropriate content, just over a month since its implementation, we’re already seeing censorship of discussions around sexual health, transitioning, and politics, among other important conversations. This raises the question: is the act protecting young people, or is it cutting us off from key information?
Firstly, let’s look at what the Online Safety Act actually sets out. In a government explainer, it says that the law will protect children and adults online and “give providers new duties to implement systems and processes to reduce risks their services are used for illegal activity, and to take down illegal content when it does appear.”
It also states that “platforms will be required to prevent children from accessing harmful and age-inappropriate content and provide parents and children with clear and accessible ways to report problems online when they do arise.”
It adds that Ofcom, the independent regulator for online safety, will “set out steps providers can take to fulfil their safety duties in codes of practice,” and that the regulator has a “broad range of powers to assess and enforce providers’ compliance with the framework.”
The recent age-verification enforcement has rippled through online spaces. But what are young people actually seeing and feeling since the new law came in? Sky News spoke to young people about what they’ve experienced since the new rules came into force. One 17-year-old said that the internet was a “very, very malicious” place before and described frequently seeing inappropriate content, but now their algorithm seems “tamed.”
Another 16-year-old said they had previously been served a lot of eating disorder content, but “in the time that the rules have been in place, I don’t actually think I’ve seen any. I used to see them every few scrolls, so it’s very much gone down.”
Young people also told Sky News that they now feel less worried about scrolling, saying they “can actually scroll on the internet worry-free of what’s going to pop up.”
However, while it’s positive that young people appear to be seeing less harmful content while scrolling, there is also the worry that platforms are using the law to censor content, claiming certain topics breach “local laws.”
One user on X experienced this; they were blocked from seeing content about the ongoing conflict in Palestine, writing: “All the evidence I’ve seen on my timeline suggests that the Online Safety Act has nothing to do with porn and everything to do with restricting information on Palestine.”
— Rangzen (@revoltinghippie) July 29, 2025
Another post on X says: “Just a slight heads up for anyone in the UK. Twitter is restricting some Palestinian videos due to the Online Safety Act that’s meant to protect minors from NSFW content. Censorship affects everyone, it’s a slippery slope that will always be used to silence what they want.”
— m ✰ (@vmohv) July 25, 2025
It’s not just content about Palestine, either. “We’re seeing teenagers being prevented from accessing content about sexual health, politics and news. If adults want to see this content, they are forced to prove how old they are with unregulated age verification companies,” James Baker, Platform Power Programme Manager at Open Rights Group, tells Dazed. “Meanwhile, small websites are closing down because they are worried about being fined because of these onerous demands. Ofcom needs to take stock of these threats to freedom of expression and Parliament needs to reform the Online Safety Act.”
The ‘unintended’ consequences of the act seem to be silencing people’s voices and preventing young people from taking part in, and reading about, important conversations, not just blocking them from harmful content.
Baker isn’t the only person calling for the government to take stock of the impact of the Online Safety Act. A petition with over 500,000 signatures has emerged calling for the Online Safety Act to be repealed. It states: “We believe that the scope of the Online Safety Act is far broader and restrictive than is necessary in a free society. For instance, the definitions in Part 2 cover online hobby forums, which we think do not have the resource to comply with the act and so are shutting down instead.”
“We think that Parliament should repeal the act and work towards producing proportionate legislation rather than risking clamping down on civil society talking about trains, football, video games or even hamsters because it can’t deal with individual bad faith actors.” But, in reply, the Department for Science, Innovation and Technology says the government has “no plans to repeal” the act.
Is the ‘far broader and restrictive’ scope just a teething issue of a new act, or can we expect greater online censorship as further digital safety measures are set out? Either way, there has to be a better way to protect young people from truly harmful and disturbing content through platform moderation, instead of simply blocking us from anything deemed adult-only. How are young people meant to learn about politics, sexual health, and other important topics if they can’t access information about them? The internet can’t be segregated into ‘child-friendly’ and ‘over-18’ camps forever.