The Next Move

UK online safety deadlines loom, impacting 100K companies worldwide

The issue

In 2023, the United Kingdom enacted the Online Safety Act (OSA), a sweeping digital safety law that will be enforced in phases throughout 2025. What’s striking about the law, beyond its many safety requirements (particularly around children), is its broad reach. The rules affect thousands of businesses of all sizes worldwide, far exceeding the scope of the EU’s Digital Services Act (DSA) and similar online safety frameworks. Companies that view the OSA as the “UK's DSA” may be surprised to learn that it applies to a much wider range of tech businesses, not just the biggest global platforms.

In fact, it applies to more than 100,000 user-to-user (U2U) and search service providers that target the UK market or have many UK users — including social media apps, video sharing platforms (VSPs), private messaging apps, online marketplaces, gaming sites, search engines and others — even if they’re small or based overseas. For most, OSA compliance will be a significant challenge, especially for providers without regulatory compliance infrastructure or governance models in place.

While the measures seek to make the UK the world’s safest place to be online, many argue the law doesn’t go far enough, especially on disinformation (following the UK riots in 2024) and children’s access to self-harm and suicide content. Further tightening of UK online safety laws hasn’t been ruled out.

The OSA reflects the current atmosphere around content moderation and child online safety in much of the world. Even the United States is generally aligned on protecting children, as are individual states like California, though the Trump administration’s focus on freedom of speech could spur increased regionalization of services.

Compliance deadlines are fast approaching, starting with an illegal content risk assessment due on March 16, 2025. Penalties for noncompliance can include fines up to £18 million or 10 percent of the company’s qualifying worldwide revenue, whichever is greater. In serious cases, Ofcom, the OSA’s enforcement body, may seek a court order to withdraw or limit access to services in the UK. Given this, it’s imperative that affected companies begin taking steps immediately to implement systems and processes that reduce the risk of illegal or harmful content.

The regulator’s take

The OSA makes online service providers legally responsible for protecting the safety of UK users. It creates a binding “duty of care” for platforms to act against illegal or harmful content, a problem Ofcom considers increasingly widespread. The law aims to deliver change in this area through:

  • Stronger safety governance and risk management
  • Online services designed and operated with safety in mind
  • More choice for users to control online experiences
  • Greater trust built through increased transparency and accountability

Covered entities. The OSA applies to certain online services with UK users, even if they’re based elsewhere. These include:

  • U2U services allowing users to post content and interact online (including via private messaging)
  • Search engines providing search functionality
  • Sites publishing or displaying pornographic content

Providers can use Ofcom’s online regulation checker tool to find out whether the OSA rules apply to them.

Duties of care. Covered entities must satisfy online safety requirements for OSA compliance. These include illegal harms duties, child safety duties and additional obligations for categorized services. In most cases, this involves conducting risk assessments and implementing and recording the applicable safety measures. Ahead of enforcement, Ofcom periodically publishes draft codes of practice and guidance to help determine which duties, measures and additional requirements service providers must meet.

Safety measures. Ofcom has identified eight areas where it believes the risk of harm is greatest and where providers can take the most effective mitigation steps.

Service providers can improve safety in these areas with Ofcom’s risk-based codes of practice, which recommend a mix of governance-focused, cross-cutting and harm-specific measures. Some measures will apply to all services, while others won’t. Which measures apply will depend on several factors including service type, features and functionalities, and number of users.

Implementation rollout. Ofcom has reached several major milestones in its implementation plan thus far. Although final guidance for most duties is still under consideration, Ofcom encourages service providers to take steps to comply with the new rules right away. The most recent roadmap outlines three distinct implementation phases, along with key compliance resources that in-scope providers can use to get started. Keep in mind, though, that these dates may change.

  • Phase 1, illegal harms duties: Providers must assess the risk of illegal content on their services — the first assessments are due March 16, 2025 — and then implement the applicable safety measures recommended in Ofcom’s codes of practice.
  • Phase 2, child safety duties: Providers must determine whether children are likely to access their services and, where they are, conduct children’s risk assessments and apply the child safety measures.
  • Phase 3, additional compliance duties: Eventually, services meeting certain thresholds could be designated as category 1, 2A or 2B and face additional requirements. Secondary legislation setting these thresholds will be based on Ofcom’s advice and is expected by summer 2025. A register of categorized services and a list of emerging category 1 services will be published sometime in late 2025.

OSA compliance in a broader context. Faced with these many challenging compliance duties, some companies may be tempted to adopt short-term solutions such as geo-blocking UK users. While this approach may temporarily spare them from OSA compliance, it doesn’t promote safety-by-design principles, and it avoids the core issue that companies need to confront — the prevalence of illegal content and potential disinformation that’s gaining the attention of policymakers worldwide. With other online safety frameworks emerging in Australia, Canada and elsewhere, organizations should view OSA compliance as a preview of what’s coming globally and prepare accordingly.
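For context, the geo-blocking workaround mentioned above is often little more than a request filter keyed on a country code resolved upstream by a CDN or GeoIP lookup. The sketch below is purely illustrative — the `X-Country-Code` header and function names are assumptions, not part of any particular framework — but it shows why the approach is technically trivial yet strategically shallow:

```python
# Hypothetical sketch of a geo-blocking request filter.
# Assumes an upstream proxy/CDN resolves the client country and passes it
# in an "X-Country-Code" header (an assumption for illustration only).

BLOCKED_COUNTRIES = {"GB"}  # ISO 3166-1 alpha-2 code for the United Kingdom


def should_block(headers: dict) -> bool:
    """Return True if the request originates from a blocked country."""
    country = headers.get("X-Country-Code", "").upper()
    return country in BLOCKED_COUNTRIES


def handle_request(headers: dict) -> tuple[int, str]:
    """Return an HTTP-style (status, body) pair for the request.

    451 is the HTTP status code for content unavailable for legal reasons.
    """
    if should_block(headers):
        return 451, "Service unavailable in your region"
    return 200, "OK"
```

Note that filters like this are easily circumvented (for example, by VPNs), which is one more reason regulators and the article above treat geo-blocking as a stopgap rather than a compliance strategy.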

Your next move

Your risk assessments for illegal harms and children’s access will dictate the steps you should take, and the codes of practice will indicate which recommended measures apply based on your assessments. Some providers, especially those subject to the EU’s DSA, may already have some of the necessary infrastructure in place and can build on existing risk assessment processes and safety measures to satisfy OSA requirements. For others, OSA compliance could be the beginning of their digital safety regulatory transformation, which may require some investment. Consider taking the following steps.

  1. Assess illegal harms risk. For many companies, illegal harms risk assessments are already underway. Large platforms that have previously conducted DSA system risk assessments will need to build on existing frameworks, which can be a heavy burden — especially as compliance fatigue sets in. If you’re starting from scratch, begin by reviewing the illegal harms risk assessment process, focusing on the specific measures relevant to your services. Ofcom has released extensive guidance — so the key is to develop a sustainable framework that fits your organization’s needs. In the first year, expect this process to be largely manual, but there will be opportunities to refine, streamline and automate over time. The DSA’s publicly available system risk assessments (SRAs) offer rich insights for companies seeking to understand different approaches to digital safety risk evaluation and mitigation.
  2. Assess children’s access risk. Determine whether it’s possible for children to access your service. Review the guidance for children’s access assessments and first establish whether you have highly effective age assurance in place to age-gate your service, or part of it. If children are likely to access all or part of your service, you must conduct a children’s risk assessment, review it regularly thereafter and comply with the child safety duties. Although final guidance hasn’t been published for children’s risk assessments, Ofcom encourages all providers likely to fall in scope to start preparing.
  3. Identify gaps and build a compliance roadmap. Begin by assessing your current safety measures against the codes of practice, which provide the most reliable and defensible path to compliance. Review which measures are recommended for your service’s size and risk level, identifying gaps where existing measures may be insufficient or require remediation. Prioritization is key. Concentrate on addressing the highest-risk areas first while developing a roadmap for improvements. If opting for alternative measures, carefully document how they align with relevant duties. Compliance will be an evolving process, so a structured, phased approach can help you maintain focus.
  4. Enhance governance measures. Specific measures required will depend on the size, type and risk level of your service, but at a minimum you’ll need to name an individual accountable to the most senior governance body who’s responsible for compliance with the illegal content safety duties and reporting and complaints duties. Other obligations may include establishing written codes of conduct and statements of responsibilities, implementing monitoring and assurance and/or conducting training, among others.
  5. Plan for transparency. If you anticipate being a categorized service, start planning for transparency reporting now. Define transparency reporting and data strategies ahead of transparency notices in late 2025. Identify stakeholders, locate data, decide how you’ll collect and analyze it and determine how you’ll demonstrate its auditability.
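The children’s access decision in step 2 above can be summarized as a simple flow. The sketch below is an illustration only, not legal guidance — the OSA’s actual “likely to be accessed by children” test is more nuanced than two booleans, and the function and field names are assumptions:

```python
def childrens_duties_apply(highly_effective_age_assurance: bool,
                           children_likely_to_access: bool) -> dict:
    """Illustrative sketch of the children's access decision flow.

    Mirrors step 2: if the service (or the relevant part of it) is
    age-gated with highly effective age assurance, children are treated
    as unable to access it; otherwise, if children are likely to access
    it, a children's risk assessment and the child safety duties apply.
    """
    if highly_effective_age_assurance:
        return {"child_risk_assessment_required": False,
                "reason": "service is age-gated with highly effective age assurance"}
    if children_likely_to_access:
        return {"child_risk_assessment_required": True,
                "reason": "children are likely to access all or part of the service"}
    return {"child_risk_assessment_required": False,
            "reason": "children are not likely to access the service"}
```

Treat the outcome as the trigger for the recurring review cycle described in step 2, not as a one-off determination.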