The UK’s Online Safety Act and EU’s Digital Services Act: What Online Service Providers Should Know

12 minute read | November 6, 2023

The UK’s new Online Safety Act (the “OSA”) and the EU’s Digital Services Act (the “DSA”) both try to make the internet safer by focusing on an online service provider’s systems and processes. But differences in the laws will require online service providers targeting EU and UK users to assess which requirements from each law apply to them.

In the UK, communications regulator Ofcom will enforce the OSA. This month, Ofcom plans to publish an initial draft code of conduct and guidance in relation to illegal harms, and begin a consultation regarding the same, with final versions of these documents expected in late 2024. Later, in spring 2024, Ofcom is expected to publish an initial draft code of conduct and guidance in relation to the protection of children online, pornography and protecting women and girls (and begin a consultation regarding the same), with final versions of these documents expected in 2025. Alongside this, we also expect Ofcom, in conjunction with the UK’s Secretary of State, to issue guidance in relation to the categorisation of Services (as defined below).

The path to compliance with both the OSA and DSA is long, but as part of our ongoing coverage of the changing legislative landscape with regards to online safety and children’s privacy, here’s a look at the key similarities and differences between the OSA and the DSA (our EU DSA Readiness Assessment Tool helps companies assess whether they are subject to the DSA – and how best to comply).

Aim of the legislation

UK Online Safety Act: The OSA seeks to ensure that “user-to-user services” and “search services” (together, “Services”) take practical steps to ensure their terms of service adequately protect children and adults online (for example, by moderating the content published on their platforms). The legislation does not focus on eliminating content or removing risk altogether – it focuses on measures companies can implement to make their services safer for adults and children (those under 18).

EU Digital Services Act: The DSA aims to protect against the spread of illegal content and to protect users’ fundamental rights. In this respect, it implements not only online safety legislation but also related consumer protections, for example where an online platform allows consumers to conclude distance contracts with traders.

Scope of application

UK Online Safety Act: The OSA applies to all Services.

“User-to-user services” are internet services whereby content on the service (whether automatically generated or uploaded/shared by a user) may be accessed by another user (e.g., social media sites, online marketplaces, consumer cloud storage services).

“Search services” are search engines. The following services are exempt under the OSA:

  • Email services.
  • SMS/MMS services.
  • Services that offer only live aural communications.
  • Services where a user is limited to posting or sharing a comment/review, or expressing a view on such a comment/review (e.g., news publication websites).
  • Internal business services.
  • Services provided by public bodies.
  • Services provided in the course of education or childcare.

Some Services will be categorised as Category 1, Category 2A or Category 2B, depending on the number of users and the functionality of the service. The Secretary of State has yet to publish the criteria for each category of Service; however, we understand that Category 1 Services will include those with the largest number of users, and that the most stringent obligations will apply to them.

EU Digital Services Act: The DSA applies to online intermediary services such as hosting services (including online platforms like social media sites, online marketplaces and cloud storage services), as well as caching services, mere conduit services and online search engines.

The European Commission designates companies as “very large online platforms” and “very large online search engines” if they have 45 million or more average monthly active users in the EU.

The European Commission has so far designated 17 “very large online platforms” and two “very large online search engines.” (Note that a “very large” designation will not necessarily translate into a “Category 1 Service” designation under the UK Online Safety Act, and vice versa.)

Territorial Scope

UK Online Safety Act: The OSA applies to Services that have “links to the United Kingdom,” i.e., if:

  • they have a significant number of users in the UK; or
  • the UK is one of their target markets (or the only target market).

Services may also be regarded as having links to the UK if they are capable of being used in the UK by individuals and reasonable grounds exist to believe that people in the UK face a material risk of significant harm from user-generated content or search content (as applicable), which is available on the Service. 

EU Digital Services Act: The DSA applies to online intermediaries that offer services to individuals and entities located or established in the EU, regardless of whether the intermediary itself is established or based in the EU.
Covered content

UK Online Safety Act: The OSA differentiates between the following:

  • “Illegal content” relates to terrorism, child sexual exploitation and abuse, assisting suicide, threatening to kill, public order offences (including harassment, stalking, and fear or provocation of violence), hate crime, supply of drugs and psychoactive substances, firearms and other weapons, assisting illegal immigration, sexual exploitation, pornography (including revenge pornography), assisting crime, and fraud.
  • “Harmful content” is content that is age-inappropriate, pornographic, or material that does not meet a criminal threshold but promotes, encourages or provides instructions for suicide, self-harm or eating disorders; depicts or encourages serious violence; and/or relates to bullying. For content to be “harmful” to adults, it must fall within the Secretary of State’s description of “priority content.” For children, “harmful content” must fall within the Secretary of State’s descriptions of “primary priority” or “priority” content, or otherwise “present a material risk of significant harm to an appreciable number of children.” These descriptions have not yet been published.

EU Digital Services Act: “Illegal content” includes any information or activity (including the sale of products or provision of services) that does not comply with EU law or the law of a Member State, irrespective of the precise subject matter of that law. The DSA therefore applies to content as well as to physical goods and commercial practices (including the sale of products and provision of services).

The DSA does not cover harmful content. Note, however, that the threshold for what constitutes illegal content differs among the Member States.

Overview of duties

UK Online Safety Act: The OSA contains a variety of duties, the application of which varies according to a Service’s categorisation (Category 1, 2A or 2B), whether it is high-risk, and whether it is likely to be accessed by children.

Design and operation

All Services must use proportionate measures relating to the design and/or operation of the service to:

  • Prevent individuals from encountering illegal content.
  • Mitigate and manage the risk of someone using the service to facilitate a priority offence.
  • Effectively mitigate and manage the risks of harm to individuals.

Systems and processes

All Services must use proportionate systems and processes to:

  • Minimise the length of time any illegal content is present on the Service.
  • Swiftly take down illegal content when made aware of it.

Services must also ensure that the following are designed to help protect individuals online:

  • Functionalities, algorithms, and other features.
  • Content moderation tools.
  • User empowerment technologies to enable individuals to filter out content and/or non-verified users.
  • User support and reporting mechanisms.
  • Staff policies and practices (as well as any other relevant internal policies).

Terms of service and risk reporting

The OSA focuses in part on consumer protection. For example, a Service must ensure that its terms of service are applied consistently. Terms must also:

  • Explicitly state how individuals are to be protected from illegal content.
  • State whether any proactive technology is being used and if so, explain that technology.
  • Be drafted in a way that is clear and accessible.

All Services must also carry out and publish risk assessments and transparency reports related to illegal and harmful content on their services. Category 1 Services must also summarise the findings of these risk assessments in their terms of service.

EU Digital Services Act: The DSA imposes a number of duties, the application of which varies according to the category and size of the online intermediary (“very large online platforms” and “very large online search engines” have the most obligations).

Design and operation

Key obligations on intermediary services include:

  • Implementation of measures to counter the dissemination of illegal goods, services, or online content.
  • Prohibition on the use of dark patterns (deceptive or manipulative online interfaces).
  • “Very large online platforms” and “very large online search engines” must also analyse the systemic risks arising from their platforms.

Systems and processes

Key obligations on intermediary services include:

  • Implementation of a mechanism that allows users to report illegal content through the intermediary service.
  • Adoption of measures to ensure the traceability of online traders.
  • Implementation of measures to safeguard users, such as an obligation on online platforms to inform users when their content is removed or restricted and to give those users the opportunity to challenge content moderation decisions.
  • “Very large online platforms” and “very large search engines” must also offer users a system for recommending content that is not based on profiling.

Terms of service and risk reporting

Key obligations on intermediary services include:

  • Issuing an annual transparency report (or a report every six months for “very large online platforms” and “very large online search engines”).
  • Providing clear terms of service that include the restrictions on the use of the service, content moderation policies and information regarding algorithmic decision-making.
  • Disclosing the parameters used in their recommender systems.
  • Compliance with wide-ranging transparency obligations.
  • Providing information to authorities on any actions taken in response to takedowns and disclosure orders.
  • Designating a point of contact for communications with authorities and users.
  • Appointing a legal representative within the EU (if located outside the EU).
Child users

UK Online Safety Act: Services must protect children (those under 18) from the risk of harm arising out of using the Service, including by preventing them from accessing content that is harmful to children. Additionally, all Services must:

  • Conduct a child safety risk assessment (note that this needs to be more detailed for Category 1 Services).
  • Have in place measures like age verification to prevent children from accessing harmful content.
  • Provide information regarding child protection measures in their terms of service.

EU Digital Services Act: All intermediary services directed primarily at children (under 18) must:

  • Ensure their terms of service are easy for children to understand.
  • Not serve ads based on a user’s personal data where they are reasonably certain the user is a child.
  • Put in place appropriate and proportionate measures to ensure a high level of privacy, safety and security.

“Very large online platforms” and “very large online search engines” must also implement targeted risk mitigation measures to protect the rights of the child (which may include parental control tools and age verification means).
Online Advertising

UK Online Safety Act: Category 1 and Category 2A Services must:

  • Prevent individuals from encountering fraudulent advertisements in or via search results.
  • Minimise the length of time users see fraudulent advertisements.
  • Swiftly ensure individuals no longer encounter fraudulent advertisements if alerted by an individual that such content may be on the platform.
  • Provide clear and accessible provisions in a publicly available statement as to the proactive technology used to ensure compliance with these duties.

EU Digital Services Act: Online platforms must ensure users receive information in relation to online advertising that:

  • Makes clear that the content is an ad.
  • Identifies the advertiser or sponsor.
  • Details parameters used to display the advertisement, including information on how the user can change those parameters.

Online platforms must provide users with functionality that enables them to declare their content a “commercial communication.”

“Very large online platforms” and “very large search engines” must also provide a user with the following information:

  • Whether the advertisement was targeted.
  • The relevant parameters used, and the number of recipients reached.
Criminal Offences

UK Online Safety Act: The OSA has created the following new criminal offences for posters of content (not the Services or their directors/employees):

  • Harmful or false communications, where a person sends a communication that they know to be false, with the intention to cause non-trivial emotional, psychological or physical harm (punishable by up to 51 weeks imprisonment).
  • Threatening communications that pose or convey a threat of serious harm (up to 5 years imprisonment).
  • Cyberflashing, where a person sends unsolicited sexual images via social media or data sharing services (up to 2 years imprisonment).
  • Flashing, aimed at stopping epilepsy trolling (up to 5 years imprisonment).
  • Assisting or encouraging self-harm online.

Companies and senior managers of Services may be criminally liable where a Service fails to comply with Ofcom enforcement notices in relation to specific child safety duties, or in relation to child sexual abuse and exploitation. Ofcom may also take criminal action against senior managers who fail to provide requested information.

EU Digital Services Act: The DSA does not create any criminal offences. However, online intermediaries will not be able to benefit from the law’s exemption from liability in relation to illegal content if they have actual knowledge of the illegal content and fail to act against it.


Enforcement

UK Online Safety Act: Ofcom will have the power to fine Services up to £18 million or 10% of their annual global turnover, whichever is greater.

In the most extreme cases, with the agreement of the courts, Ofcom will be able to require payment providers, advertisers, and internet service providers to stop working with a Service, preventing it from generating money or being accessed from the UK. 

EU Digital Services Act: Each EU Member State will appoint a Digital Services Coordinator responsible for supervising intermediary services established in that Member State (with the European Commission supervising “very large online platforms” and “very large online search engines”). The Digital Services Coordinators (or the European Commission, where applicable) will be able to act against intermediary services by imposing fines of up to 6% of global annual turnover and, for the most serious contraventions, a temporary suspension of the service.

Service recipients (or an association mandated on their behalf) may file complaints against intermediary services for breach of the DSA with the relevant Digital Services Coordinator. Service recipients may also seek compensation from intermediary services for loss or damage suffered as a result of a provider’s non-compliance with its obligations under the DSA (subject to national Member State law).