
The UK Enacts the Online Safety Act to Increase Obligations for Internet Platforms

26 September 2023

On 19 September 2023, the British Parliament approved the Online Safety Bill (the “Bill”) in a final vote. The Bill’s main goal is to protect children and adults online and to increase the responsibility and obligations of internet platforms with regard to the safety of their users. The UK Office of Communications (“OFCOM”) will be the authority entrusted with implementing and enforcing the Bill once it receives Royal Assent and becomes an Act.

The Bill is part of an ongoing legislative trend across jurisdictions to increase the transparency, reporting and content moderation obligations of internet platforms. Alongside the European Union’s Digital Services Act, which will come into full force in February 2024, and the Digital Markets Act, which will also begin to be enforced in the first quarter of next year, these laws will create a new global standard of responsibility for internet platforms, particularly social media and content platforms.

Failing to comply with the new regulatory regime may result in fines of up to GBP 18 million (USD 22.3 million) or 10% of annual turnover, whichever is higher.

 

Application

The Bill applies to three types of internet services and platforms (“Regulated Services”) that have “links to the United Kingdom”. Such links may exist merely where the service is offered to individuals in the UK and there are reasonable grounds to believe that the content presented on the service may pose a material risk of significant harm to individuals.

The Regulated Services include user-to-user services, search engine services and providers of pornographic content.

Only a few types of services are exempt from complying with the Bill, such as email services; SMS services; one-to-one live aural communications; internal business services; limited functionality services (e.g., services where only limited portions of the service allow users to communicate); and public bodies.

 

Duties of Care and Obligations

All internet services and platforms will be designated as Category 1, 2A or 2B services. The category will depend on the number of users (size) and the functionalities of the service. However, the thresholds for each category have not yet been determined and will be set out in secondary legislation.

User-to-user services that meet the Category 1 thresholds, which will be those posing the highest risk and having the largest user bases, will be subject to additional duties. Search engine services classified as Category 2A and user-to-user services classified as Category 2B (below the Category 1 thresholds) will be subject to transparency reporting obligations.

In addition, Regulated Services that are likely to be accessed by children will be subject to a higher standard of requirements than those that are not.

The main duties and obligations that Regulated Services will be subject to are the following:

1. Risk assessment: All Regulated Services must conduct an “illegal content risk assessment”, which covers the user base; the level of risk of users encountering illegal content (e.g., terrorism content, child sexual exploitation content, etc.); the level of potential harm; the functionalities of the service that may disseminate illegal content; and how to design and operate the service in a manner that reduces any identified risks.

Regulated Services may also need to conduct a “children’s risk assessment” or an “adults’ risk assessment”, depending on whether their services are likely to be accessed by children or fall within the scope of Category 1, respectively.

2. Safety: These duties include moderating and monitoring the presence of illegal content on the Regulated Service; removing any illegal content upon becoming aware of it; and specifying in the terms of service the measures taken to protect individuals from illegal content.

Regulated Services that are likely to be accessed by children must also prevent children from accessing harmful and age-inappropriate content.

3. Freedom of expression and privacy: All Regulated Services must protect their users’ right to freedom of expression and protect users from unwarranted infringements of privacy when implementing the safety duties under the Bill.

In addition, Category 1 Regulated Services will also need to take into account the protection of content of democratic importance and the protection of journalistic and news publisher content.

4. User reporting and redress: Regulated Services must operate systems that allow users to report illegal or harmful content, including transparent complaints and redress procedures.

5. Record-keeping and review: Regulated Services must record and retain documentation of all steps taken to comply with the Bill.

6. Fraudulent advertising: Category 1 and Category 2A Regulated Services have additional obligations to prevent individuals from encountering content consisting of fraudulent advertising and to minimize the length of time for which such advertising is presented.

7. Transparency: OFCOM will require all Regulated Services to produce an annual transparency report covering information such as the incidence of illegal content; the number of users assumed to have encountered illegal content or content that is harmful to children; and the measures taken to mitigate risks. OFCOM will have discretion as to which information must be included in the report.

8. Age gating: Providers of pornographic content must ensure that children are not normally able to encounter pornographic content published on their service (e.g., by using age verification). These providers must also keep a written record of the measures they have taken to prevent children from encountering such content. OFCOM will publish guidelines with examples of appropriate policies and measures for meeting this requirement.

 

Upcoming Steps

Once the Bill receives Royal Assent, OFCOM will commence the process of creating regulatory guidelines, codes of practice and procedures to implement the duties imposed by the Bill, along with the applicable enforcement timeline.

Companies providing user-to-user services to users in the United Kingdom should evaluate their exposure to this new content moderation and monitoring regulatory regime. Feel free to contact us if you have any questions regarding the Bill and its practical implications.
