The UK Online Safety Act – Ofcom Releases First Set of Regulatory Guidelines

23 December 2024

The UK Online Safety Act (“Act”) has entered its implementation phase. On 16 December 2024, Ofcom, the UK communications regulator, published its first major policy statement and accompanying regulatory guidelines, which focus on safeguarding individuals from illegal online harms. The statement outlines notable developments for in-scope online service providers, including user-to-user and search services (“Providers”). Our previous update on the enactment of the Act and its main objectives is available here.

 

Providers are now required to conduct a risk assessment of illegal harms on their services, with the deadline set for 16 March 2025. Furthermore, beginning 17 March 2025, Providers are required to either implement the safety measures outlined in Ofcom’s Illegal Harms Codes of Practice or demonstrate the adoption of alternative measures that effectively mitigate the risks related to illegal content and activities.

 

Key Highlights and Requirements

To assist Providers in navigating the requirements of the Act, Ofcom has released the statement alongside various codes and decisions. These materials are organized into multiple volumes, regulatory documents, and annexes, each addressing specific aspects of the Act and its associated responsibilities, including:

 

  1. Risk assessment guidance and risk profiles: This guidance helps Providers identify risks of illegal content on their platforms and implement safety measures to protect users in the UK. It includes risk profiles to evaluate platform features contributing to illegal content and a Register of Risks offering detailed analysis of associated risks.

 

Providers must, by 16 March 2025, assess the risk of illegal harms on their services, considering factors such as their user base, functionalities and content. This includes identifying potential risks related to priority illegal content such as terrorism, child sexual abuse material (“CSAM”), and fraud. Based on their risk assessments, Providers are required to take proportionate steps to mitigate and manage the identified risks. This can involve implementing age assurance technologies, proactive content moderation, clear terms of service, and user reporting mechanisms.

 

  2. Illegal content judgements guidance: This guidance provides a framework for Providers to assess whether online content constitutes “illegal content”, covering both priority offences and relevant non-priority offences. The guidance highlights the concept of “reasonable grounds to infer”, requiring Providers to evaluate evidence, consider context, and make informed judgements about the presence of illegal activity.

 

When encountering potentially illegal content, Providers must assess whether there are “reasonable grounds to infer” that the content constitutes a relevant offence under UK law. They need to consider factors such as the context of the content, the user’s intent, and any applicable legal defences.

 

  3. The illegal content codes of practice, for user-to-user and search services: These codes address key duties such as governance, content moderation, user choices, reporting and complaints processes, and the use of proactive technology, with adjustments tailored to user-to-user or search services.

 

Providers are required to implement measures to remove all types of illegal content, including hate speech, terrorist content, fraud, and CSAM. This requires appropriately resourced and trained content moderation teams and the use of effective algorithms. In addition, Providers must establish clear and accessible complaints procedures for users to report suspected illegal content and other concerns. They also need to handle complaints efficiently and fairly. Lastly, reporting channels for users must be established to swiftly identify and address fraudulent activity, helping to mitigate financial and psychological harm.

 

Ofcom also highlights the need to implement safety defaults for children’s accounts, restrict contact with unknown adults, and take action against the sharing of illegal content such as CSAM. Additionally, Providers must actively mitigate harms against women and girls, including by enabling blocking and muting functionalities and taking down content related to intimate image abuse.

 

  4. Guidance on content communicated “publicly” and “privately”: This guidance assists Providers of user-to-user services in applying measures to content communicated publicly. It details the factors Providers should consider when making the risk assessment, including the number of UK individuals able to access the content, any access restrictions, and the ease of sharing or forwarding.

 

  5. Record-keeping and review guidance: This guidance requires Providers to create and maintain written records of their risk assessments and any measures implemented to comply with the Act.

 

Providers are required to keep thorough records of their risk assessments, safety measures, content moderation processes, and complaint handling procedures. These records must be clear, accessible, and up-to-date to demonstrate compliance with the Act.

 

Sanctions and Enforcement

With the publication of its Online Safety Enforcement Guidance, Ofcom is prepared to enforce compliance with the Act through significant penalties, including fines of up to £18 million or 10% of a Provider’s global revenue, whichever is greater, as outlined in our previous update. For severe violations, Ofcom may seek court orders to block non-compliant services from operating in the UK.

 

Future Developments

The Act’s implementation will be phased, with further codes and guidance from Ofcom to be introduced during 2025. The Codes of Practice are anticipated to complete the Parliamentary process ahead of the 17 March 2025 deadline, when Ofcom will begin enforcing the illegal harms safety duties.

 

How to Prepare Your Business for Compliance

Ofcom’s first set of regulatory guidelines represents a notable milestone in the implementation of the Act. With a three-month timeframe to comply, Providers must promptly assess their obligations and determine whether their services may expose users in the UK to illegal content.

 

For assistance or guidance on implementing the required measures under the Act in the UK, please do not hesitate to reach out:

Ariel Yosefi – yosefia@herzoglaw.co.il

Ido Manor – manori@herzoglaw.co.il

Liron Adar – adarl@herzoglaw.co.il
