12 minute read | November 6, 2023
The UK’s new Online Safety Act (the “OSA”) and the EU’s Digital Services Act (the “DSA”) both try to make the internet safer by focusing on an online service provider’s systems and processes. But differences in the laws will require online service providers targeting EU and UK users to assess which requirements from each law apply to them.
In the UK, communications regulator Ofcom will enforce the OSA. This month, Ofcom plans to publish an initial draft code of practice and guidance in relation to illegal harms, and to begin a consultation on those documents, with final versions expected in late 2024. In spring 2024, Ofcom is expected to publish an initial draft code of practice and guidance in relation to the protection of children online, pornography and protecting women and girls (and to begin a consultation on those documents), with final versions expected in 2025. Alongside this, we expect Ofcom, in conjunction with the UK’s Secretary of State, to issue guidance on the categorisation of Services (as defined below).
The path to compliance with both the OSA and the DSA is long, but as part of our ongoing coverage of the changing legislative landscape regarding online safety and children’s privacy, here’s a look at the key similarities and differences between the OSA and the DSA (our EU DSA Readiness Assessment Tool helps companies assess whether they are subject to the DSA – and how best to comply).
Aim of the legislation
UK Online Safety Act

The OSA seeks to ensure that “user-to-user services” and “search services” (together “Services”) take practical steps to ensure their terms of service adequately protect children and adults online (for example, by moderating the content published on their platforms). The legislation does not focus on eliminating content and/or removing risk altogether – it focuses on measures companies can implement to make their services safer for adults and children (those under 18).
EU Digital Services Act

The DSA aims to protect against the spread of illegal content and to protect users’ fundamental rights. It covers not only online safety but also related consumer protection, for example where an online platform allows consumers to conclude distance contracts with traders.
Application
UK Online Safety Act

The OSA applies to all Services. “User-to-user services” are internet services on which content (whether automatically generated or uploaded/shared by a user) may be encountered by another user (e.g., social media sites, online marketplaces, consumer cloud storage services). “Search services” are search engines.

The following services are exempt under the OSA: email services, SMS/MMS services, services that offer only live aural communications, services where a user is limited to posting or sharing a comment/review (or expressing a view on such a comment/review) (e.g., news publication websites), internal business services, services provided by public bodies and services provided in the course of education or childcare.

Some Services will be categorised as Category 1, Category 2A or Category 2B, depending on the number of users and functionality of the service. The Secretary of State has yet to publish the criteria for each category of Service; however, we understand that Category 1 Services will include those with the largest number of users, to which more stringent regulation will apply.
EU Digital Services Act

The DSA applies to online intermediary services such as hosting services (including online platforms like social media sites, online marketplaces and cloud storage services), as well as caching services, mere conduit services and online search engines. The European Commission designates companies as “very large online platforms” and “very large online search engines” if they have 45 million or more average monthly active users in the EU. To date, the European Commission has designated 17 “very large online platforms” and two “very large online search engines.” (Note that a “very large” designation will not necessarily translate into a “Category 1 Service” under the UK Online Safety Act, and vice versa.)
Territorial Scope
UK Online Safety Act

The OSA applies to Services that have “links to the United Kingdom” – broadly, Services that have a significant number of UK users or for which the UK is a target market. Services may also be regarded as having links to the UK if they are capable of being used in the UK by individuals and there are reasonable grounds to believe that people in the UK face a material risk of significant harm from user-generated content or search content (as applicable) available on the Service.
EU Digital Services Act

The DSA applies to online intermediaries that offer services to individuals and entities located or established in the EU, regardless of whether the intermediary itself is established in the EU.
Covered content
UK Online Safety Act

The OSA differentiates between “illegal content” and content that is legal but harmful to children, with different duties attaching to each.
EU Digital Services Act

“Illegal content” includes any information or activity (including the sale of products or provision of services) that does not comply with EU law or the law of a Member State, irrespective of the precise subject matter of that law. The DSA therefore applies to content as well as physical goods and commercial practices (including the sale of products and provision of services). The DSA does not cover content that is harmful but legal. Note, however, that what constitutes illegal content can differ among Member States.
Overview of duties
UK Online Safety Act

The OSA contains a variety of duties, the application of which varies according to a Service’s category (Category 1, 2A or 2B) and whether it is likely to be accessed by children.

Design and operation

All Services must use proportionate measures relating to the design and/or operation of the service to prevent users from encountering priority illegal content and to mitigate and manage the risks of harm identified in their risk assessments.
Systems and processes

All Services must use proportionate systems and processes to minimise the length of time illegal content is present on the service and to swiftly take down such content once they become aware of it.
Services must also ensure that features of the service (such as its functionalities, algorithms, content moderation processes and user support measures) are designed to help protect individuals online.
Terms of service and risk reporting

The OSA focuses in part on consumer protection. For example, a Service must ensure that its terms of service are applied consistently. Terms must also be clear and accessible.
All Services must also carry out and publish risk assessments and transparency reports related to illegal and harmful content on their services. Category 1 Services must also summarise the findings of these risk assessments in their terms of service.
EU Digital Services Act

The DSA imposes a number of duties, the application of which varies according to the category and size of the online intermediary (“very large online platforms” and “very large online search engines” have the most obligations).

Design and operation

Key obligations, which vary by category of provider, include designing online interfaces that do not deceive or manipulate users (“dark patterns”) and providing transparency about the main parameters of any recommender systems used.
Systems and processes

Key obligations include, depending on the category of service, establishing notice-and-action mechanisms for reporting illegal content, providing statements of reasons for content moderation decisions and operating internal complaint-handling systems.
Terms of service and risk reporting

Key obligations include setting out content moderation policies clearly in the provider’s terms and conditions and publishing regular transparency reports. “Very large online platforms” and “very large online search engines” must also carry out annual assessments of the systemic risks arising from their services.
Child users
UK Online Safety Act

Services likely to be accessed by children must protect children (those under 18) from content that is harmful to them and from other risks of harm arising from use of the Service. Additionally, all Services must assess whether their service is likely to be accessed by children and, where it is, carry out a children’s risk assessment and use proportionate measures (such as age assurance) to prevent children from encountering the most harmful content.
EU Digital Services Act

All intermediary services directed primarily at children (under 18) must explain their conditions of use, and any restrictions on use of the service, in a way that children can understand. Online platforms accessible to children must also put in place appropriate measures to ensure a high level of privacy, safety and security for them, and must not present advertisements based on profiling using a child’s personal data.
Online Advertising
UK Online Safety Act

Category 1 and Category 2A Services must operate proportionate systems and processes designed to prevent users from encountering fraudulent advertisements, minimise the length of time any such content is present and swiftly remove it once they become aware of it.
EU Digital Services Act

Online platforms must ensure that, for each advertisement, users can identify in a clear and unambiguous manner that the content is an advertisement, on whose behalf it is presented, who paid for it and the main parameters used to determine the recipients to whom it is shown.
Online platforms must also provide users with functionality that enables them to declare their content a “commercial communication.” “Very large online platforms” and “very large online search engines” must additionally maintain a publicly available repository of the advertisements presented on their services, including the content of each advertisement, the advertiser, the period during which it was presented and the main targeting parameters.
Criminal Offences
UK Online Safety Act

The OSA creates new criminal offences for posters of content (not the Services or their directors/employees), including sending false communications, sending threatening communications, sending or showing flashing images with intent to harm people with epilepsy, “cyberflashing” and encouraging or assisting serious self-harm.
EU Digital Services Act

The DSA does not create any criminal offences. However, online intermediaries will not be able to benefit from the law’s exemption from liability in relation to illegal content if they have actual knowledge of the illegal content and fail to act against it.
Enforcement
UK Online Safety Act

Ofcom will have the power to fine Services up to £18 million or 10% of their annual global turnover, whichever is greater. In the most extreme cases, with the agreement of the courts, Ofcom will be able to require payment providers, advertisers and internet service providers to stop working with a Service, preventing it from generating money or being accessed from the UK.
EU Digital Services Act

Each EU Member State will appoint a Digital Services Coordinator responsible for supervising intermediary services established in that Member State (with the European Commission supervising “very large online platforms” and “very large online search engines”). The Digital Services Coordinators (or the European Commission, where applicable) will be able to act against intermediary services by way of fines of up to 6% of global annual turnover and, for the most serious contraventions, a temporary suspension of the service. Service recipients (or an association mandated on their behalf) may file complaints against intermediary services for breach of the DSA with the relevant Digital Services Coordinator. Service recipients also have a right to seek compensation from intermediary services for loss or damage suffered as a result of a provider’s non-compliance with its obligations under the DSA (subject to national Member State law).