The Four “W”s and One “H” of Europe's Digital Services Act

May 19, 2022

An update to this earlier article was published in October 2022 following approval of the Digital Services Act (DSA) by the Council of the EU Member States.

After more than a year of negotiations, the final text of the European Union’s (“EU”) Digital Services Act (“DSA”) has been agreed upon by the European Parliament, the French Presidency of the Council of the EU, and the European Commission (“EU Commission”).

This update provides a quick summary and a deeper dive into who is impacted by the DSA and the key obligations it imposes, to help organisations that qualify as EU-oriented intermediary service providers identify the internal processes and products that will need to be adapted in order to achieve compliance.

The final agreed-upon text of the DSA has not been made public. The Orrick team is actively monitoring developments and this update will be amended once the final text is available.

Quick Summary

What Is the DSA?

The DSA updates and builds on the eCommerce Directive 2000/31/EC. Its dual objective is the creation of a European digital space where the rights of consumers and professional users are protected, and the establishment of a level playing field that will foster growth and innovation.

To achieve these objectives, the DSA imposes new obligations on intermediary service providers to identify and adopt measures to prevent the dissemination of “illegal” online content. The DSA also contains rules, described in more detail below, regarding mandatory information that must be included in the service provider’s terms of use for both consumers and professional users.

Importantly, the DSA does not replace other EU legislation that imposes rules regarding online content.

What Businesses Are Impacted and What Initial Steps Should They Take?

  • Online Service Providers: Online service providers that target EU users should assess whether they are a provider of “intermediary services” and, if so, which of the DSA’s obligations will be applicable to their business.
  • Intermediary Service Providers That Are Online Platforms: Intermediary service providers in the form of “online platforms”, even those that are not especially large in terms of users, are particularly impacted, since most of the obligations in the DSA apply to online platforms. Such platforms will need to implement policies, internal processes and new functionalities in order to comply. The so-called Very Large Online Platforms (VLOPs) impacted by the DSA are already taking steps to implement their compliance.
  • Advertisers: Advertisers should also take note of the new transparency obligations that apply to intermediaries in relation to the ads displayed on their services.

Deeper Dive Into the DSA – An Overview of the Digital Services Act

1. Application

The DSA has a tiered structure in terms of the applicability of its obligations, as shown in the graphic below. All of the law’s obligations apply to providers of “intermediary services”. A subset of those obligations applies to “hosting services”, including online platforms. A further subset applies to “online platforms”, and a still further subset applies only to “very large online platforms”.

[Graphic: The Four “W”s and One “H” of the European Commission’s Digital Services Act]

Intermediaries Under the DSA Include:

  • Mere conduit services such as internet service providers.
  • Caching services such as cloud services providers offering automatic, intermediate and temporary storage of information, for the sole purpose of making the information's onward transmission more efficient.
  • Hosting services, which store information provided by users (e.g., cloud service providers, web hosting). Hosting services include:
    • Online platforms: Hosting services that store and disseminate information to the public, at the request of service users, as their primary activity (e.g., marketplaces, app stores, collaborative economy platforms and social media platforms). Providers of hosting services that disseminate such information as a minor or purely ancillary feature of another service will not be considered online platforms (for example, the comments section of an online newspaper).
    • VLOPs: Very large online platforms are online platforms that, for at least four consecutive months, provide their services to an average of 45 million or more monthly active recipients of the service in the Union (see the illustrative sketch below).
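For illustration only, the threshold test can be sketched in code. This is a minimal sketch assuming one plausible reading of the four-consecutive-months test, with hypothetical user figures; the official methodology for counting average monthly active recipients is to be set out in further legislation.

```python
# Illustrative sketch only: the DSA's official counting methodology is
# still to be defined, and both the figures and this reading of the
# four-consecutive-months test are assumptions.

VLOP_THRESHOLD = 45_000_000   # 45 million average monthly active EU recipients
MIN_CONSECUTIVE_MONTHS = 4

def qualifies_as_vlop(monthly_active_eu_recipients: list[int]) -> bool:
    """Return True if the threshold is met in four consecutive months."""
    streak = 0
    for month_count in monthly_active_eu_recipients:
        streak = streak + 1 if month_count >= VLOP_THRESHOLD else 0
        if streak >= MIN_CONSECUTIVE_MONTHS:
            return True
    return False

# Hypothetical platform: five recent months of EU active recipients.
print(qualifies_as_vlop([44_000_000, 46_000_000, 47_000_000,
                         45_500_000, 45_100_000]))  # True
```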

2. Territorial Scope

The DSA applies to intermediary services that are provided to users that are established or resident in the EU, regardless of where the intermediary itself is established. Non-EU based intermediaries that have EU-based users will therefore be expected to comply.

3. Intermediary Liability

The eCommerce Directive enshrined the principle that intermediaries shall not be liable for information transmitted through their services, provided they were not actively involved in the transmission and/or acted to remove or disable access to the information upon receiving notice.

The DSA retains this exemption from liability for intermediaries, with slight modifications, but imposes on hosts (and the subset categories of online platforms and very large online platforms) a set of due diligence requirements in relation to illegal content, as described below.

4. Obligations

Note: This section will be reviewed and may be subject to change when the final text of the law is published.

A. Applicable to All Intermediaries

Intermediary service providers will be required to do the following:

  • Identify a single point of contact within their organisation for national authorities. Intermediaries that do not have an establishment in the EU will have to appoint a legal representative in a Member State where the intermediary offers its services (there may be a possibility of collective representation for micro, small and medium enterprises).
  • Comply with specific obligations in relation to the form and content of their terms and conditions. For instance, the terms must be fair, non-discriminatory and transparent, and must include information regarding how to terminate services, restrictions imposed on the delivery of services, and the use of algorithmic tools for content moderation.
  • Protect the anonymity of users, except in relation to traders.
  • Publish an annual transparency report on any content moderation engaged in, including specified information such as the number of orders received from Member States’ authorities and information about the service’s own-initiative content moderation practices, including the use of automated tools. (This obligation does not apply to micro or small enterprises that do not qualify as very large online platforms.)

B. Applicable Only to Intermediaries That Are Hosting Services, Including Online Platforms [Article 14 et seq.]

  • Hosting services shall have a notification mechanism allowing users to flag content they consider to be illegal. The mechanism must be designed to facilitate sufficiently precise and substantiated notices that permit the identification of the illegality of the content.
  • ‘Illegal content’ means any information or activity, including the sale of products or the provision of services, which is not in compliance with EU law or the law of a Member State, irrespective of the precise subject matter of that law.
  • If a notice containing the required information is received, the hosting service will be deemed to have actual awareness of the content and its potential illegality (which has implications for the service’s liability).
  • Hosting services must provide a statement of reasons to the user if their content is disabled or removed. This explanation must contain certain information, including the facts relied upon and a reference to the legal ground relied upon, or other basis for the decision if it was based on the host’s terms and conditions. However, law enforcement authorities may request that no explanation be provided to users, if necessary due to an ongoing investigation.
  • There is a positive obligation to alert law enforcement or judicial authorities if the host suspects that a serious criminal offence involving a threat to life or safety of persons is taking place or is planned.
  • The anonymity of the content reporter is to be protected, except in relation to reports involving alleged violation of image rights and intellectual property rights.
  • The use of “dark patterns” is prohibited. Specific design practices are prohibited by the DSA (such as giving one consent option more visual prominence than another), and more may be identified through further legislation.

C. Applicable Only to Hosting Services That Are Online Platforms [Article 16 et seq.]

  • The obligations in this section do not apply to micro or small enterprises, except if they qualify as very large online platforms. Intermediary services may apply to be exempted from the requirements of this section of the DSA.
  • Online platforms shall provide an online appeals process against decisions taken by the platform in relation to content that is judged to be illegal or in breach of the platform’s terms and conditions. Decisions shall not be fully automated and shall be taken by qualified staff.
  • Users will be able to refer decisions to an out-of-court dispute settlement body certified by the Digital Services Coordinator of the relevant Member State.
  • Content reported by trusted flaggers shall be processed with priority and without delay. An entity may apply to the Digital Services Coordinator to be designated as a trusted flagger, based on criteria set out in the DSA.
  • Users that repeatedly upload illegal content may be suspended for a reasonable period of time, after a prior warning has been issued. Online platforms shall also suspend the processing of notices and complaints from users that repeatedly submit unfounded notices and complaints.
  • Online platforms are required to ensure that their services meet the accessibility requirements set out in the EU Directive 2019/882, including accessibility for persons with disabilities, and shall explain how the services meet these requirements.
  • To ensure the traceability of traders, online marketplaces shall only allow traders to use their platform if the trader first provides certain mandatory information to the platform, including contact details, an identification document, bank account details, and details regarding the products that will be offered.
  • A trader who has been suspended by an online platform may appeal the decision using the online platform’s complaint handling mechanism.
  • Online platforms shall also provide additional information in their transparency reports regarding the number of complaints received, the number of disputes submitted to the out-of-court dispute settlement bodies, the number of suspensions, the number of advertisements removed (with reasons) and use made of automated content moderation tools.
  • To promote online advertising transparency, online platforms shall ensure that service users receive the following information regarding online ads: that the content presented to users is an advertisement, the identity of the advertiser or person that has financed the advertisement, information regarding the parameters used to display the ad to the user (and information about how a user can change those parameters).
  • Users shall also receive information about how their data will be monetized.
  • Refusing consent must be as easy for the user as giving consent to a service or functionality. If consent is refused, users shall be given other fair and reasonable options to access the online platform.
  • Targeting techniques that involve the personal data of minors, or sensitive personal data (as defined under the GDPR), are prohibited.
  • Online platforms have transparency obligations regarding any recommender system that is used to promote content. The online platform must disclose the main parameters used, as well as options for the recipient to modify or influence the parameters.
  • Specific obligations for online platforms that disseminate user-generated pornographic content include mandatory verification of users that post content; professional human content moderation; and a specific notification procedure for the take-down of content posted without the consent of a person depicted in it.

D. Applicable Only to Online Platforms Which Are Very Large Online Platforms [Articles 25 et seq.]

  • The obligations in this section apply only to online platforms that provide their services to 45 million or more monthly active users, calculated in accordance with a methodology to be set out in further legislation. The European Commission will designate the online platforms that qualify as Very Large Online Platforms (“VLOPs”), and the list will be published.
  • VLOPs shall carry out an annual risk assessment of their services (and in any event before launching a new service). The risk assessment shall take into account, in particular, the risks of dissemination of illegal content, negative effects on the exercise of fundamental rights, amplification of illegal content due to a malfunctioning or manipulation of their services, and negative effects on public health. VLOPs shall consult with user representatives, independent experts and civil society organisations.
  • VLOPs must implement mitigation measures to deal with these systemic risks.
  • VLOPs shall have audits carried out by independent firms to assess their compliance with the DSA and any commitments undertaken pursuant to a code of conduct.
  • VLOPs that use recommender systems must provide at least one that is not based on profiling and must provide users with functionality to allow them to set their preferred options for content ranking.
  • Additional advertising transparency obligations are applicable, requiring the publication of information regarding the advertisements that have been displayed on the platform, including whether the advertisement was targeted to a group, the relevant parameters and the total number of recipients reached.
  • Content that qualifies as a “deep fake” shall be clearly labeled.
  • VLOPs are required to share data with authorities, where necessary for them to monitor and assess compliance with the DSA. Such information might include explanations of the functioning of the VLOP’s algorithms. The regulator may also require that VLOPs allow “vetted researchers” (those that satisfy the DSA’s requirements) to access data, for the sole purpose of conducting research that contributes to the identification and understanding of systemic risks.
  • European legislators agreed to the last-minute insertion of a crisis mechanism, in the form of an obligation on VLOPs to implement mitigating measures if a public crisis occurs. The text of this addition is not yet public.
  • VLOPs shall appoint a compliance officer responsible for monitoring their compliance with the DSA.

5. Enforcement and Sanctions [Articles 38 et seq.]

A. Digital Services Coordinator - Designation and Powers

  • Each Member State shall designate one or more competent authorities as responsible for the application and enforcement of the DSA, and one of these authorities shall be appointed by the Member State as its Digital Services Coordinator. This Digital Services Coordinator will be the main enforcement authority. For non-EU based intermediaries, the competent Digital Services Coordinator will be located in the Member State where these intermediaries have appointed their legal representative. If no legal representative has been designated, then all Digital Services Coordinators will be competent.
  • Digital Services Coordinators are granted investigation and enforcement powers, in particular the power to accept intermediary services’ commitments to comply with the DSA, order the cessation of infringements, impose remedies and fines, and impose periodic penalty payments.
  • Users have the right to lodge a complaint against providers of intermediary services alleging an infringement of the DSA with the Digital Services Coordinator of the Member State where the recipient resides or is established.

B. Sanctions

  • Temporary access restrictions. Where enforcement measures are exhausted, and in case of persistent and serious harm, the Digital Services Coordinator may request that the competent judicial authority of the Member State order the temporary restriction of access to the infringing service or to the relevant online interface.
  • Fines. Sanctions shall be “effective, proportionate and dissuasive”. Member States shall ensure that the maximum amount of penalties imposed for a failure to comply with the provisions of the DSA does not exceed 6% of the annual income or turnover of the intermediary. The maximum amount of a periodic penalty payment shall not exceed 5% of the average daily turnover of the provider in the preceding financial year, per day (see the illustrative arithmetic below).
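As a purely illustrative arithmetic sketch of how these caps scale, assume a hypothetical intermediary with EUR 2 billion in annual turnover; the actual basis of calculation will depend on the final text.

```python
# Illustrative arithmetic only; the turnover figure is hypothetical and
# the basis of calculation will depend on the final text of the DSA.

annual_turnover = 2_000_000_000             # hypothetical: EUR 2 billion
avg_daily_turnover = annual_turnover / 365  # simple daily average

max_fine = 0.06 * annual_turnover              # 6% cap on fines
max_daily_penalty = 0.05 * avg_daily_turnover  # 5% of avg daily turnover, per day

print(f"Maximum fine:             EUR {max_fine:,.0f}")           # EUR 120,000,000
print(f"Max periodic penalty/day: EUR {max_daily_penalty:,.0f}")  # EUR 273,973
```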

What's Next?

Once the final text of the DSA is published, this note will be updated. We will also be publishing more detailed information regarding the other key content-related obligations that will continue to apply, alongside the DSA, to intermediaries operating in the French market.

If you have any questions, please do not hesitate to contact our team.