EU Proposes Legislation Targeting Online Child Sexual Abuse Material

The new regulation would apply alongside the Digital Services Act
August.10.2022

On 11 May 2022, the European Commission announced a proposed regulation aimed at combating and preventing the sexual abuse of children online (the Regulation of the European Parliament and of the Council laying down rules to prevent and combat child sexual abuse, or the “Regulation”). The use of online services to perpetrate crimes against children has increased over the past few years, and the Commission considers that the voluntary preventive measures adopted by online service providers have proven insufficient to address the misuse of their services for these purposes.

The proposed Regulation seeks to establish a harmonized legal framework for preventing and combating online child sexual abuse by providing legal certainty to service providers regarding their responsibilities to assess, mitigate, detect and report the use of their services for abuse.

At the end of July, the European Data Protection Board (“EDPB”) and the European Data Protection Supervisor (“EDPS”) issued a joint opinion on the proposal, noting in particular that it “raises serious concerns” regarding the necessity and proportionality of its interference with, and limitations on, the rights to privacy and the protection of personal data.

This note sets out a summary of some of the draft Regulation’s central obligations.

Key Takeaway:

  • The draft Regulation is at an early stage of the legislative process, and its content will evolve. Once adopted, however, it will apply in parallel with the EU Digital Services Act (“DSA”) and cover a large number of different online service providers. Companies that are developing their DSA compliance strategies should keep this proposal on their radar.

Who Is Covered:

The rules will apply to hosting services (as defined by the EU Digital Services Act), interpersonal electronic communications services (such as email), software applications and software application stores (both as defined by the EU Digital Markets Act), and internet access services.

If a covered service has an establishment in the EU, or targets its activities towards one or more Member States, then it will have to comply with the Regulation. Covered services established outside the EU will also be required to appoint a representative in the EU to serve as a point of contact for regulators.

Key Provisions:

This is a non-exhaustive summary of some of the proposed obligations:

  • Mandatory Risk Assessment. Hosting services and providers of interpersonal electronic communications services will be required to conduct a risk assessment relating to the use of their services for ‘online child sexual abuse’ (this covers the online dissemination of child sexual abuse material and the solicitation of children for sexual purposes). This assessment is to be conducted after the Regulation comes into force and then at least once every three years.
  • Risk Mitigation Measures. Hosting services and providers of interpersonal electronic communications services will have to adopt mitigation measures targeted to address the identified risks. No specific measures are set out in the draft Regulation, but they must be targeted, proportionate, applied in a nondiscriminatory manner and effective. The same providers will also have to submit regular risk reports to the designated national authority.
  • Obligations of Software App Stores. App stores will have to make reasonable efforts to assess, where possible together with the relevant application provider, whether each service offered through the platform presents a risk of being used for the solicitation of children. They will also have to take reasonable measures to prevent child users (under 17 years of age) from accessing software applications for which they have identified a significant risk, and to use the necessary age verification and age assessment measures to reliably identify child users on their services.
  • Detection Orders. National authorities will be empowered to order hosting service providers and providers of interpersonal electronic communications services to take measures to detect online child sexual abuse on a specific service, subject to certain control and oversight measures (for instance, a “significant” risk must have been identified, and the order must comply with the formal requirements set out in the Regulation). Providers that receive a Detection Order will be required to execute it by installing and operating technologies to detect the dissemination of known or new child sexual abuse material, or the solicitation of children.
  • Reporting Obligations. A hosting service or provider of interpersonal communications services that becomes aware of any information indicating potential online child sexual abuse on its services must submit a report to the EU Centre (see below) containing certain mandatory information. The service provider must also inform the user concerned, including about the user’s right to complain to the national authority.
  • Preservation of Information. Providers of hosting services and providers of interpersonal communications services must preserve the content data and other data processed in connection with the measures taken to comply with the Regulation, as well as the personal data generated through such processing, for 12 months, or longer if so ordered by the national authority or a court.
  • Victims’ Rights. The Regulation recognizes a victim’s right to obtain information from the national authority as to whether the dissemination of known child sexual abuse material depicting them has been reported to the EU Centre. Victims will also have a right to assistance and support in having such content removed.
  • New European Agency. The Regulation provides for the creation of the EU Centre on Child Sexual Abuse, responsible for overseeing implementation of the Regulation and facilitating the cooperation and coordination of national authorities. The EU Centre will be based in The Hague, in the Netherlands.
  • Enforcement & Sanctions. National authorities can apply to courts for removal and blocking orders. Penalties for noncompliance will be set by the Member States, but fines shall not exceed 6% of annual income or global turnover in the preceding year.

Responses to the Regulation:

A number of civil society organizations involved in the protection of children’s rights have welcomed the proposed legislation, calling it “timely and historic.” However, in addition to the joint opinion of the EDPB and the EDPS, concerns have been raised by privacy associations regarding the potential privacy and security implications associated with technological measures adopted to comply with detection orders.

Next Steps:

The text of the proposed Regulation is open for public consultation until late August.