
The Online Safety Act: what do the amendments mean for young people’s online usage?

For Sexual Health Week we are exploring how being chronically online is influencing our attitudes and behaviours around sex and relationships.

In this blog, Sian Hopkins, Education and Wellbeing Specialist in Wales, explains what the recent changes to the Online Safety Act 2023 mean for young people’s internet access and how we must adapt our messaging as educators.

On 17 March 2025, new measures added by the UK Government to the Online Safety Act 2023 came into effect, establishing that “online platforms have a legal duty to protect their users from illegal content.”

The original Online Safety Act was passed in October 2023. Additions made in March 2025 gave the independent online safety regulator, Ofcom, the power to monitor how online providers protect users and to enforce sanctions when they fail to comply.

These duties were reinforced on 25 July 2025. Social media companies and search engine services now have a legal responsibility to protect children and young people from harmful or age-inappropriate content. Sites must use age-assurance tools, such as age verification, to stop children from viewing pornography or material that encourages self-harm, suicide or eating disorders.

Platforms must also provide clear and accessible ways for parents and children to report problems or inappropriate content.

What Counts as Illegal or Inappropriate Content?

The Act sets out forms of illegal content that companies must actively prevent and remove. These include:

  • child sexual abuse
  • controlling or coercive behaviour
  • extreme sexual violence
  • extreme pornography
  • fraud
  • racially or religiously aggravated public order offences
  • inciting violence
  • illegal immigration and people smuggling
  • promoting or facilitating suicide
  • intimate image abuse
  • selling illegal drugs or weapons
  • sexual exploitation
  • terrorism

All internet and social media platforms must remove illegal content that involves a victim once it is reported or otherwise brought to their attention. They must also take steps to stop it appearing in the first place.

Protecting Children and Young People

The Act also covers content that may not be illegal but is considered harmful for children. Companies must prevent young people from accessing this material.

The Act distinguishes two tiers of this material. Primary priority content, which platforms must prevent children from encountering altogether, includes:

  • pornography
  • content that encourages, promotes or gives instructions for self-harm, eating disorders or suicide

Priority content, from which children must be protected, includes:

  • bullying
  • abusive or hateful content
  • content that depicts or encourages serious violence or injury
  • content that promotes dangerous stunts and challenges
  • content that encourages ingestion, inhalation or exposure to harmful substances

How the Law Will Be Enforced

Ofcom is responsible for ensuring that platforms meet the requirements of the Act. This applies to companies outside the UK if their services are accessible here.

If Ofcom believes a company is failing in its duties, it can take action. Fines can reach £18 million or 10 per cent of global turnover, whichever is higher. Senior managers who do not comply with Ofcom information requests can also face criminal action.

Age Verification

Platforms can check users’ ages in several ways, for example:

  • credit card checks, where a payment processor confirms the cardholder’s age
  • digital identity services such as identity wallets that securely share proof of age
  • email-based age estimation, which infers age from where an email address has been used (for example, with utility providers)
  • facial age estimation, where technology estimates age from a photo or video
  • mobile network operator checks, where the provider confirms whether an age filter is applied
  • open banking, where users allow an age-check service to confirm their age from bank data
  • photo ID matching, where a user uploads an ID document and a current photo for comparison
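
For readers curious how these checks fit together in practice, here is a minimal, purely illustrative TypeScript sketch of how a platform might gate restricted content behind an interchangeable age-assurance service. Nothing in it comes from the Act, Ofcom or any real provider; every name and threshold is an assumption for illustration only.

```ts
// Hypothetical sketch (not any real provider's API): a platform treats
// each age-assurance method as an interchangeable service and gates
// restricted content on its answer.

interface AgeCheckResult {
  method: string;        // e.g. "facial-estimation", "open-banking"
  isAdult: boolean;      // the provider's yes/no answer
  confidence?: number;   // estimation-based methods return a score
}

interface AgeAssuranceProvider {
  check(userId: string): Promise<AgeCheckResult>;
}

// Decide whether to serve age-restricted content to this user.
async function canViewRestrictedContent(
  userId: string,
  provider: AgeAssuranceProvider,
): Promise<boolean> {
  const result = await provider.check(userId);
  // Estimation methods (facial or email-based) are probabilistic, so a
  // platform might require a minimum confidence before accepting a
  // positive result, falling back to a stronger check otherwise.
  if (result.confidence !== undefined && result.confidence < 0.9) {
    return false;
  }
  return result.isAdult;
}
```

The point of the sketch is that the methods listed above are alternatives behind a common question ("is this user over 18?"), which is why platforms can mix deterministic checks (photo ID, open banking) with probabilistic ones (facial or email-based estimation).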

Implications for Educators

At Brook we teach young people how to stay safe online and explain that the law is there to protect them, not to punish curiosity. Pornography can be a source of exploration around sex and sexuality, and it is important to stress that these legal changes are not about criminalising young people for viewing it. Responsibility lies with the companies that fail to protect under-18s.

Pornography can be harmful, particularly if accessed accidentally or without understanding, and can give misleading ideas about sex. This is why high-quality, age-appropriate PSHE or RSE education is essential. When children and young people receive accurate, balanced information about sex and relationships, they are less likely to rely on pornography or harmful online sources and are better protected both online and offline.