On 16 November 2022, EU Regulation 2022/2065, better known as the Digital Services Act ("DSA"), came into force. The DSA is a key development in the regulation of online services in the European Union ("EU"), with an impact expected to be as significant as that which the General Data Protection Regulation ("GDPR") had, from 25 May 2018, upon the collection, use, transfer, and storage of data originating in the EU.

Ambit

The DSA sets out rules and obligations for digital services providers that act as intermediaries connecting consumers with goods, services, and content.

Its goal is to regulate and control the dissemination of illegal or harmful content online, to provide greater consumer protection in online marketplaces, and to introduce safeguards for internet users and users of digital services. It also introduces new obligations on major online platforms and search engines to prevent those platforms from being abused.

The DSA applies to a wide range of providers of:

(a) Intermediary services offering network infrastructure such as internet access providers, domain name registrars, and other providers of what is described as 'mere conduit' or 'caching' services;

(b) Hosting services such as cloud and web hosting services;

(c) Online platforms bringing together sellers and consumers, such as online marketplaces, app stores, collaborative economy platforms, and social media platforms; and

(d) Very large online platforms and very large online search engines that are used to disseminate content and information.

The DSA applies in the EU, and to providers outside the EU that offer their services in the EU. A provider that is not established in the EU must appoint a legal representative within the EU.

The DSA splits providers into tiers. The most heavily regulated tier covers Very Large Online Platforms ("VLOPs") and Very Large Online Search Engines ("VLSEs"). The main criterion that will bring a provider within the scope of the DSA as a VLOP or VLSE is whether it operates a platform with 45 million or more average monthly active users located in the EU.
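By way of illustration only, this designation turns on a simple threshold check over averaged user figures, as in the sketch below. The 45 million threshold comes from the DSA itself; the function names and sample figures are hypothetical, and formal designation remains a decision for the Commission, made on the basis of the user figures that providers must themselves publish.

```python
# Illustrative sketch only: the VLOP/VLSE user threshold reduced to code.
# The 45 million figure comes from the DSA; the function names and the
# sample data are hypothetical.

VLOP_THRESHOLD = 45_000_000  # average monthly active users located in the EU

def average_monthly_active_users(monthly_figures: list[int]) -> float:
    """Average the monthly active user counts over the reporting period;
    the DSA works from figures averaged over six months."""
    return sum(monthly_figures) / len(monthly_figures)

def meets_vlop_threshold(monthly_figures: list[int]) -> bool:
    """True where the service reaches the VLOP/VLSE user threshold."""
    return average_monthly_active_users(monthly_figures) >= VLOP_THRESHOLD

# Six hypothetical months of EU user figures for one platform:
figures = [44_000_000, 46_500_000, 45_200_000,
           47_100_000, 46_800_000, 45_900_000]
print(meets_vlop_threshold(figures))  # True: the average is ~45.9 million
```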

Features

The DSA introduces:

  • Mechanisms allowing users to flag illegal content online, and for platforms to cooperate with specialised 'trusted flaggers' to identify and remove illegal content;
  • New rules to trace sellers on online marketplaces, and a new obligation on online marketplaces to carry out random checks against existing databases on whether products or services offered on their sites comply with the law;
  • Safeguards on moderation of content on platforms, giving users a chance to challenge platforms' content moderation decisions when their content gets removed or restricted;
  • Transparency on the algorithms used for recommending content or products to users;
  • New obligations to protect minors on any platform in the EU;
  • A requirement for VLOPs to mitigate abuse of their systems which could lead to, for example, disinformation or election manipulation, cyber violence against women, or harm to minors online;
  • Bans on targeted advertising on online platforms where it profiles children or is based on special categories of personal data such as ethnicity, political views, or sexual orientation;
  • A ban on the use of 'dark patterns' within the interfaces deployed by online platforms, which appears to target systems that might manipulate users into making choices they do not intend to make; and
  • New rights for users, including the right to complain to a platform, seek out-of-court settlements, complain to their national authority in their own language, and seek compensation for breaches of the rules. Representative organisations will be able to defend user rights in the event of large-scale breaches of the law.

Obligations

The DSA takes an asymmetric approach to obligations, setting them out based on the category or type of services provided. Providers are matched with different and cumulative sets of obligations, commensurate with their role and significance within the digital services market; the cumulative layering of the four sets described below is sketched in code at the end of this section.

The first set of obligations applies to all providers of intermediary services, and includes requirements to:

  1. Establish single points of contact for direct communication with regulatory authorities and with their users respectively;
  2. Designate a legal representative within the EU if the provider is established outside the EU. The representative can be held liable for the provider's non-compliance, without prejudice to the provider's own liability;
  3. Set out clear terms and conditions covering any restrictions that the provider may apply to the services provided, such as the policies, procedures, or tools used in content moderation, with additional requirements depending on the provider's type or category; and
  4. Publish annual transparency reports regarding any content moderation activities the provider has engaged in, with additional reporting obligations depending on the provider's type or category.

The second set of obligations then applies to all hosting services, including all online platforms, where providers must:

  1. Implement notice and action mechanisms. The mechanism must allow users to notify the provider of allegedly illegal content, alongside supporting information (a sketch of such a notice follows this list). A notice under this mechanism can give the provider actual knowledge of the content concerned, which may displace certain limitations of liability, or shields, granted under the DSA; and
  2. Report criminal offences to the law enforcement or judicial authorities of the relevant EU Member State(s) when the provider has information indicating that a criminal act threatening a person's life or safety will take place.
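By way of illustration, a notice under such a mechanism must carry enough supporting information for the provider to assess and act on it. The sketch below is a hypothetical model of what a notice submission might capture; the field names are illustrative and are not drawn from the text of the DSA.

```python
# Hypothetical sketch of a notice submitted under a notice and action
# mechanism. All field names are illustrative, not statutory.
from dataclasses import dataclass

@dataclass
class IllegalContentNotice:
    content_url: str            # exact location of the allegedly illegal content
    explanation: str            # why the notifier considers the content illegal
    notifier_name: str          # identity of the person or entity notifying
    notifier_email: str         # contact point for follow-up
    good_faith_statement: bool  # notifier confirms the notice is accurate

def is_actionable(notice: IllegalContentNotice) -> bool:
    """A provider would first check that a notice is substantiated:
    a sufficiently precise notice can amount to actual knowledge of
    the content, with consequences for the liability shield."""
    return (notice.good_faith_statement
            and bool(notice.content_url)
            and bool(notice.explanation))
```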

The third set of obligations applies to online platforms, and includes requirements to:

  1. Publish, at least once every six months, their monthly active user figures averaged over the preceding six months. This obligation also extends to online search engines;
  2. Implement a complaint and redress system for users regarding the provider's decisions on content moderation;
  3. Engage, in good faith, with out-of-court dispute settlement bodies certified under the DSA;
  4. Prioritise notices filed by 'trusted flaggers' designated under the DSA;
  5. Implement measures and protections against misuse, including systems for complaint handling, warning, review, and suspension;
  6. Publish annual transparency reports, with greater reporting obligations, including information on out-of-court dispute settlements and reports on suspensions or other protections against misuse;
  7. Follow Commission guidelines regarding interface design and advertising;
  8. Adhere to the advertising rules under the DSA, by presenting transparency information regarding the advertiser, allowing users to declare whether submitted content is a commercial communication, and abstaining from using special categories of personal data to target advertisements at users; and
  9. Protect minors, by abstaining from targeting advertising at minors on the basis of their personal data and by implementing such appropriate measures as may be issued by the Commission.

In addition to the above, VLOPs and VLSEs are then subject to more onerous requirements to:

  1. Implement risk management and crisis response mechanisms, including the appointment of a compliance officer;
  2. Share data with the relevant authorities and researchers;
  3. Adhere to codes of conduct pertaining to data access, transparency, and advertising as well as pay a supervisory fee; and
  4. Adopt external, independent risk and accountability measures.
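As noted above, these four sets of obligations are cumulative: each tier inherits everything that applies to the tiers below it. The sketch below models that layering as nested sets; the tier names mirror the DSA's categories, but the obligation labels are shorthand for the lists above, not statutory language.

```python
# Illustrative model of the DSA's cumulative, tiered obligations.
# Tier names mirror the DSA's categories; obligation labels are
# shorthand summaries of the lists above, not statutory text.

INTERMEDIARY = {
    "points of contact", "EU legal representative",
    "clear terms and conditions", "annual transparency report",
}
HOSTING = INTERMEDIARY | {
    "notice and action mechanism", "report serious criminal offences",
}
ONLINE_PLATFORM = HOSTING | {
    "complaint and redress system", "out-of-court dispute settlement",
    "trusted-flagger priority", "advertising transparency",
    "protection of minors",
}
VLOP_VLSE = ONLINE_PLATFORM | {
    "risk management and crisis response", "data sharing with authorities",
    "codes of conduct and supervisory fee", "independent audits",
}

# Each tier is a strict superset of the one below it:
assert INTERMEDIARY < HOSTING < ONLINE_PLATFORM < VLOP_VLSE
```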

The Commission will also consult on and develop applicable standards, codes of conduct, and crisis protocols, where applicable, to further clarify providers' obligations.

What is illegal?

The DSA sets out EU-wide rules covering the detection, flagging, and removal of illegal content, as well as a new risk assessment framework for VLOPs and VLSEs addressing how illegal content spreads on their services. Crucially, however, the DSA does not define what is considered "illegal".

Rather, what constitutes illegal content is defined by other laws, either at the EU level or at individual Member State (national) level. For example, terrorist content, child sexual abuse material, and illegal hate speech are defined at the EU level and are accordingly illegal across the whole EU. Where content is illegal only in one Member State, as a general rule it should only be removed by providers in the territory where it is illegal, and not across the whole EU.

Enforcement

A similar split between the EU and national levels applies to enforcement. A cooperation mechanism spanning Member State (national) level and EU level will supervise how providers adapt their systems to the new requirements. Supervision will be shared between the Commission – primarily responsible for VLOPs and VLSEs – and the Member States, which are responsible for smaller providers based in or operating out of their territory.

Penalties

Member States must designate competent authorities – Digital Services Coordinators – by 17 February 2024. These independent authorities will be responsible for supervising the providers based in their Member State and for participating in the EU cooperation mechanism of the DSA. Each Digital Services Coordinator will have the authority to carry out investigations, conduct audits, accept undertakings or commitments from providers on how they will remedy infringements, and impose penalties, including financial fines.

Each Member State must specify the applicable penalties in its national law, in line with the requirements set out in Article 52 of the DSA. The maximum fines that Member States can provide for are as follows:

  • failure to comply with an obligation under the DSA attracts a maximum fine of 6% of the annual worldwide turnover of the infringing provider in the preceding financial year; and
  • failure to supply correct, complete, or accurate information, failure to reply to or rectify incorrect, incomplete, or misleading information, and failure to submit to an inspection attract a maximum fine of 1% of the annual income or worldwide turnover of the infringing provider or person concerned in the preceding financial year.

Alternatively, Member States may impose periodic penalty payments of up to 5% of the average daily worldwide turnover or income of the infringing provider in the preceding financial year, per day of non-compliance, calculated from the date specified in the relevant decision.
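To make these caps concrete, the short sketch below applies the three maximum figures to a hypothetical provider. The turnover figure is invented purely for illustration; the percentages are those set out above.

```python
# Hypothetical worked example of the DSA's maximum penalty caps.
# The turnover figure is invented; the percentages are those in the DSA.

annual_worldwide_turnover = 10_000_000_000  # EUR, preceding financial year

max_fine_non_compliance = 0.06 * annual_worldwide_turnover  # 6% cap
max_fine_information = 0.01 * annual_worldwide_turnover     # 1% cap
max_periodic_penalty_per_day = 0.05 * (annual_worldwide_turnover / 365)  # 5% of average daily turnover

print(f"Non-compliance cap:       EUR {max_fine_non_compliance:,.0f}")        # EUR 600,000,000
print(f"Information-failure cap:  EUR {max_fine_information:,.0f}")           # EUR 100,000,000
print(f"Periodic penalty per day: EUR {max_periodic_penalty_per_day:,.0f}")   # ~EUR 1,369,863
```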

For VLOPs and VLSEs, however, the Commission has sole and direct supervision and enforcement powers and can, in the most serious cases, impose fines of up to 6% of the annual worldwide turnover of the infringing VLOP or VLSE.

Note that this 6% maximum fine is higher than the maximum fine that can be imposed for a breach of the GDPR, which is capped at 4% of annual worldwide turnover.

Impact

Whilst the DSA is already in force, it will not be directly applicable across the EU until 17 February 2024. In the months leading up to that date, EU Member States must empower their national authorities to enforce the DSA's rules. For VLOPs and VLSEs, which are supervised directly by the Commission, the impact will be felt sooner: all online platforms which are not small enough to fall outside the rules must publish data on their numbers of monthly active users by 17 February 2023, allowing the Commission to identify which of them qualify as VLOPs or VLSEs.

As for the law itself, it is a clear reaction to concerns – not unique to the EU – that online platforms can be used to spread misinformation or for illegal purposes. It also appears that the introduction of the law was accelerated by concerns over the involvement of bad actors in election campaigns, as well as concerns about the impact online platforms, and specifically social media, may have upon minors and society in general.

When the GDPR came into force, many other jurisdictions such as Singapore and China passed or amended laws to adopt broadly similar provisions; it remains to be seen whether the same could happen with the DSA. That said, given the increasing global concern as to how major social media platforms can be misused, one would expect other jurisdictions to be watching the rollout of the DSA with interest.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.