For Sexual Health Week we are exploring how being chronically online is influencing our attitudes and behaviours around sex and relationships.
In this blog, Sian Hopkins, Education and Wellbeing Specialist in Wales, explains what the recent changes to the Online Safety Act 2023 mean for young people’s internet access and how we must adapt our messaging as educators.
On 17 March 2025, new measures added by the UK Government under the Online Safety Act 2023 came into effect, establishing that "online platforms have a legal duty to protect their users from illegal content."
The original Online Safety Act was passed in October 2023. Additions made in March 2025 gave the independent online safety regulator, Ofcom, the power to monitor how online providers protect users and to enforce sanctions when they fail to comply.
These duties were reinforced on 25 July 2025. Social media companies and search engine services now have a legal responsibility to protect children and young people from harmful or age-inappropriate content. Sites must use age-assurance tools, such as age verification, to stop children from viewing pornography or material that encourages self-harm, suicide or eating disorders.
Platforms must also provide clear and accessible ways for parents and children to report problems or inappropriate content.
The Act sets out the forms of illegal content that companies must actively prevent and remove.
All internet and social media platforms must remove illegal content that involves a victim once it is reported or otherwise brought to their attention. They must also take steps to stop it appearing in the first place.
The Act also covers content that may not be illegal but is considered harmful for children. Companies must prevent young people from accessing this material.
Primary Priority Content includes pornography and material that encourages suicide, self-harm or eating disorders.
Ofcom is responsible for ensuring that platforms meet the requirements of the Act. This applies to companies outside the UK if their services are accessible here. If Ofcom believes a company is failing in its duties, it can take action. Fines can reach £18 million or 10 per cent of global turnover, whichever is higher. Senior managers who do not comply with Ofcom information requests can also face criminal action.
Platforms can check users’ ages in several ways, for example photo-ID matching, facial age estimation, credit card checks or digital identity services.
At Brook we teach young people how to stay safe online and explain that the law is there to protect them, not to punish curiosity. Pornography can be a source of exploration around sex and sexuality, and it is important to stress that these legal changes are not about criminalising young people for viewing it. Responsibility lies with the companies that fail to protect under-18s.
Pornography can be harmful, particularly if accessed accidentally or without understanding, and can give misleading ideas about sex. This is why high-quality, age-appropriate PSHE or RSE education is essential. When children and young people receive accurate, balanced information about sex and relationships, they are less likely to rely on pornography or harmful online sources and are better protected both online and offline.