As the government’s focus on promoting online safety for children intensifies, digital service providers need to start taking stock of their processes and procedures – regardless of whether or not children are their intended audience. 

Contributors: Izzy Gould and Tori Lethaby

What is changing?

Protecting children and their personal data online has become a growing priority in the UK in recent years – highlighted by measures such as the ICO’s introduction of the Children’s Code in 2021 (which required online service providers to implement stronger privacy protections for children), and the passing of the Online Safety Act in 2023 (which introduced new legal duties for online service providers to protect their users – and, in particular, children – from illegal or harmful content, imposing significant fines for non‑compliance). 

In recent weeks, the subject of children’s safety online has attracted growing attention from legislators and regulators alike. In March and April 2026 alone: 

  • The ICO published an open letter calling on technology companies to strengthen age‑assurance measures to prevent children accessing services not intended for their use. 

  • The ICO and Ofcom issued a joint statement clarifying how service providers can comply with both the Online Safety Act 2023 and UK data protection laws when implementing age‑assurance measures. 

  • The House of Lords showed growing support for proposals to restrict under‑16s’ access to social media and, more recently, voted by a majority to ban mobile phones during the school day. 

  • A claimant, who began using social media as a child, succeeded in a claim against Meta for mental health harms linked to her social media addiction – bringing renewed attention to addictive platform design and its impact on children. 

It is therefore clear that, even following the implementation of the Online Safety Act, there are continuing concerns that greater protections are needed to keep children, and their data, safe online – and the government has accordingly signalled the potential for a further crackdown on harmful online content and addictive platforms.  

How could the changes affect your business?

These developments are relevant to any organisation offering digital services which could be accessed by children – intentionally or otherwise.  

Increased scrutiny is being placed on how children’s personal data is collected, used, shared, and profiled – particularly where data drives addictive design features, or targeted and/or harmful content. It would therefore be prudent to review your policies to ensure that they comply with the ICO’s Children’s Code. This includes reviewing Data Protection Impact Assessments, providing clear and age‑appropriate information to children, limiting profiling, defaulting to ‘high privacy’ settings, moderating content appropriately, and using age‑assurance measures. 

There are already substantial penalties for organisations which fail to meet their legal obligations under the Online Safety Act, including fines of up to £18 million or 10% of a company’s global annual turnover – whichever is higher. The ICO can also impose UK GDPR fines; recent examples include Reddit (£14.47 million) and MediaLab (£247,590), for failing to implement age‑assurance measures, and for the unlawful processing of children’s personal data in a way which potentially exposed children to inappropriate, harmful content. 

Organisations should ‘watch this space’, as stricter enforcement is likely to follow – particularly given that the ICO and Ofcom are clearly working closely together and treating this area as a regulatory priority, which could, in turn, increase the risk of enforcement action. 

What steps should you take to prepare?

If you haven’t already, you should prioritise assessing whether your services fall within scope of the Online Safety Act, and whether children are likely users of those services. 

Where children are users, you should urgently work to meet the Children’s Code on factors such as content moderation, data minimisation, profiling, age assurance, and platform design.  

If your organisation processes children’s personal data, you should carry out a Data Protection Impact Assessment if you have not already done so; any existing assessments should be reviewed and updated to reflect new or evolving expectations, requirements, and guidance.  

Regulators have been clear that protecting children’s privacy online is an ongoing priority. Failure to act promptly increases the risk of regulatory enforcement, financial penalties under the Online Safety Act and UK data protection laws, reputational damage, and increased litigation risk where harm to children can be linked to data‑driven design choices. 

Key takeaways

  1. Assess whether your services fall under the Online Safety Act. 
  2. Determine whether children are, or are likely to be, service users. 
  3. Ensure your operations comply with the Children’s Code. 
  4. Implement or update Data Protection Impact Assessments. 

Please be advised that this is an update which we think may be of general interest to our wider client base. The insights are not intended to be exhaustive or targeted at specific sectors as such, and whilst we naturally take every care in putting our articles together, they should not be considered a substitute for obtaining proper legal advice on key issues which your business may face.