
EU Agrees on the Digital Services Act to Regulate Internet Platforms

26 April 2022

The Council and the European Parliament have reached a provisional political agreement on the draft Digital Services Act (“DSA”), which is now expected to proceed through the final stages of the legislative process.

As previously reported in our client update, the DSA defines new standards that will govern the digital sphere and counter the dissemination of illegal and harmful content. The new act provides a comprehensive framework for platform accountability, governance, content moderation and the handling of systemic risks.

These rules will apply to online intermediary services, which include:

  • Intermediary services offering network infrastructure – such as internet access providers and domain name registrars;
  • Hosting services – such as cloud computing and web hosting services;
  • Very large online search engines reaching more than 10% of the 450 million consumers in the EU;
  • Online platforms bringing together sellers and consumers – such as online marketplaces, app stores, collaborative economy platforms and social media platforms; and
  • Very large online platforms reaching more than 10% of the 450 million consumers in the EU.

The DSA will apply to the aforementioned companies offering digital services in the EU, including companies established outside the EU. Violations of the DSA may result in fines of up to 6% of global turnover or, in the case of repeated serious breaches, even a ban on operating in the EU single market.

The act is now subject to formal approval and, once adopted, it will be directly applicable across the EU. It will apply fifteen months from its entry into force or from 1 January 2024, whichever is later. For platforms designated as very large online platforms, the act will apply four months after their designation.

The key regulatory requirements under the DSA include the following:

  • User-facing transparency of online advertising: with regard to targeted advertising, online platforms should –
      • Clearly inform users, in an unambiguous manner and in real time, whether and why they are being targeted by each ad and who paid for it, and enable users to see whether content is sponsored or posted organically on a platform;
      • Provide meaningful information about the main parameters used to determine the recipient to whom an advertisement is displayed;
      • Not target children with advertising, including by using any of their sensitive data, such as sexual orientation, political beliefs or religious views;
      • Allow users to opt out easily.

  • Notice and action and obligation to provide information to users: to the extent that an online platform recommends content or products, it must transparently explain the main parameters for the recommendation, including the algorithms used to recommend the content or products to users, and allow users to choose whether or not to display such recommendations;

  • “Dark patterns” prohibited: online platforms should not use techniques that manipulate users into doing something against their will, such as repeatedly requesting consent, making it difficult to cancel a service, giving more visual prominence to certain consent options, or presenting choices in a non-neutral manner;

  • Transparency reporting: online platforms should respond to complaints and inquiries from users whose content was removed for being contrary to the platform’s terms and conditions or deemed illegal;

  • Reporting illegal content and criminal offences: online platforms should implement mechanisms for:
      • Allowing users to notify them of content they consider illegal;
      • Ensuring that their internal complaint-handling systems are easy to access, user-friendly and allow complaints to be submitted exclusively by electronic means;
      • Informing the judicial authorities of the relevant EU Member State of any suspicion of criminal offences and providing the relevant information;

  • Measures against abusive notices and counter-notices: online platforms should suspend abusive users (those who frequently submit manifestly unfounded notices or complaints) for a reasonable period of time, after issuing a prior warning;

  • Complaint and redress mechanism, out-of-court dispute settlement: online platforms should allow users to challenge the platform’s content moderation decisions, inform them of the availability of out-of-court dispute settlement and other redress possibilities, and ensure that such decisions are not made solely on the basis of automated means;

  • Trusted flaggers mechanism: online platforms must put in place effective mechanisms to counter illegal content, goods and services by empowering, and cooperating with, users who have demonstrated expertise in successfully reporting illegal content to flag illegal goods, services or content online. In this framework, online platforms should ensure that notices submitted by trusted flaggers are processed and decided upon with priority and without delay;

  • Vetting credentials of third-party suppliers: online marketplaces will have to implement their own due diligence checks to combat counterfeit goods and verify that traders are reliable.

Lastly, the DSA sets supplementary obligations for very large online platforms, such as higher standards of transparency and accountability (annual risk assessments, requests for information by authorities and on-site inspections).

While the DSA directs increased scrutiny primarily at the large internet players, it will certainly influence, both directly and indirectly, all business entities operating in the EU online market.

Feel free to contact us if you have any questions regarding the effect of these developments on your company’s practices.

 
