Alert
September 6, 2023

Does the New EU Digital Services Act Apply to Me?

On 25 August 2023, the EU’s Digital Services Act (“DSA”) came into effect for very large online platforms and very large online search engines. The DSA becomes fully applicable to other entities within its scope on 17 February 2024.

The DSA is one of the EU’s landmark digital laws. It aims to better protect the rights and safety of users of digital services online and harmonises the obligations placed on online platforms, in particular with regard to reducing the volume of illegal content found on such platforms.

Who does the DSA apply to?

The DSA applies to all online intermediary service providers that offer ‘mere conduit’, caching or hosting services in the EU. These categories are broadly drawn and cover:

  • intermediary services offering network infrastructure such as internet access providers, domain name registrars and online search engines
  • hosting services such as cloud and web hosting service providers
  • online platforms (which are a subset of a hosting service), such as marketplaces, app stores and social media platforms

The DSA applies to online intermediary service providers regardless of whether they are established inside or outside the EU. The obligations it imposes are tiered and proportionate to the size and capabilities of the provider in question, with greater cumulative obligations imposed the further down the bulleted list above a provider falls.

The DSA establishes material fines of up to 6% of global turnover for breach, and there is a private right of action for individual recipients of services (“users”).

VLOPs and VLOSEs 

Very large online platforms (VLOPs) and very large online search engines (VLOSEs), which reach more than 45 million monthly active users in the EU, are given particular attention and are subject to additional obligations due to their size and the potential impact they can have on society. There is more detail on this below.

So far, the VLOPs designated by the Commission are Alibaba AliExpress, Amazon Store, Apple AppStore, Booking.com, Facebook, Google Play, Google Shopping, Instagram, LinkedIn, Pinterest, TikTok, Twitter, Wikipedia, YouTube, and Zalando. The designated VLOSEs are Bing and Google Search. However, there are other platforms in the Commission’s sights, so this list is likely to expand.

Amazon and Zalando have both disputed their designations in cases that will be closely watched by the market. 

What are the immediate next steps for other online intermediary service providers?

With six months left until the DSA comes into effect for other providers, businesses should establish whether they are in scope and ensure that they have a compliance plan underway.

Once applicable, what core obligations will the DSA impose on online intermediary service providers?

The DSA carries over the existing safe harbours that protect online intermediary service providers in certain circumstances where they act as a mere conduit or provide a hosting or caching service.

The DSA reiterates that there is no general duty on online intermediary service providers to monitor information or actively investigate suspected illegal activity. In this respect, the DSA can be contrasted with the UK’s proposed Online Safety Bill, which will require all online intermediary service providers to carry out illegal-content risk assessments and will impose a duty of care to mitigate and manage the identified risks. Under the DSA, this level of responsibility is reserved for VLOPs and VLOSEs.

However, in a significant change, all online intermediary service providers are now subject to obligations to act on orders issued by national authorities relating to specifically identified illegal content and to provide information about users of their services when ordered to do so. The DSA applies the rules of the physical world to the digital world, so anything that is illegal offline is also illegal online.

Aside from this, at a minimum, online intermediary service providers will be required to:

  • publish annual reports detailing any content moderation they engaged in during the relevant period
  • specify in their terms and conditions any restrictions they may impose on users of their service in relation to the information users provide
  • appoint a point of contact (or where situated outside of the EU but offering services within the EU, appoint a legal representative) to enable them to communicate with Member State authorities, the European Commission and the to-be-established European Board for Digital Services
  • appoint a single point of contact to enable users to communicate with them and provide necessary contact information

Hosting providers are now subject to a notice-and-takedown mechanism, bringing the EU in line with US practice, and to an obligation to explain to users any actions taken in relation to their content that is considered to be illegal. Hosting providers must also report to the relevant authorities suspected criminal offences involving a threat to life or safety.

Online platforms that disseminate information to the public on behalf of users are a subset of hosting providers and are subject to a further layer of obligations, including the following:

  • creation of a complaint and redress mechanism and access to out-of-court dispute settlement
  • recognition of trusted flaggers (a status awarded by a supervisory authority), whose notices of illegal content are prioritised
  • imposition of measures against abusive notices and counter-notices
  • enhanced transparency obligations, including publication of monthly active EU user numbers and identification of advertising
  • regulation of their online interfaces to prevent deception or manipulation that impairs free and informed decisions
  • specific protections for children, including a ban on advertisements targeted at children and on advertisements targeted using special characteristics of users

There are further special obligations for online marketplaces, such as vetting the credentials of marketplace sellers, designing their interfaces so that sellers can comply with their obligations under applicable EU law, and carrying out random checks for illegal goods sold on their platforms.

Finally, for those designated as VLOPs and VLOSEs, the DSA introduces significant new obligations that are already driving technical and operational changes within these businesses. Some of the more notable obligations include:

  • identification and assessment of systemic risks stemming from the design or functioning of their service, including algorithmic systems, and provision of reasonable, proportionate and effective mitigation measures against the identified risks
  • maintenance of a repository detailing information on advertisements presented on online interfaces
  • development of a new crisis response mechanism, allowing the Commission to compel certain conduct from VLOPs and VLOSEs in extraordinary circumstances
  • external and independent auditing, enhanced transparency and public accountability
  • provision to users of the choice not to have recommendations based on profiling
  • the sharing of data with authorities and researchers to enable them to understand how online risks evolve

The DSA contemplates various voluntary standards and codes of conduct to support its implementation, which will be welcome. However, the withdrawal of X (formerly Twitter) from the EU’s voluntary Code of Practice on Disinformation suggests that regulators will need to actively enforce their new legislative weapon to ensure it is credible.

As online intermediary service providers that are not VLOPs or VLOSEs start to assess their obligations under the DSA, it is important to remember that this law is one of a series of laws affecting the technology sector that have recently come into force or are making their way through the legislative process. These laws include the draft EU AI Act (see our latest article here), the Platform-to-Business Regulation (regulating online marketplaces), the Omnibus Directive (modernising the EU’s consumer laws), the Digital Copyright Directive (updating EU copyright laws for the digital age), the GDPR (regulating the processing of personal data), and the draft Data Act (see our summary of legislation to watch here). These laws are intended to be symbiotic, and businesses need to adopt a governance programme that assesses their obligations across them in a holistic and harmonised manner.