The Digital Services Act Takes Full Effect: 5 Takeaways for Businesses


6 minute read | February.21.2024

Europe's Digital Services Act (DSA) came into effect for very large online platforms and very large online search engines last year and became fully applicable on Saturday 17 February 2024.

As online service providers navigate the DSA and develop their platforms and programs to comply with the new regime, here are five takeaways for businesses to consider.

1. The DSA will impact a variety of businesses - not just Big Tech

The DSA applies to providers that offer "intermediary services" to a significant number of recipients in the EU. This includes hosting services, online platforms, online consumer marketplaces, cloud storage services, caching services, and mere conduit services.

Although the DSA sets out an increasing scale of requirements based on the breadth of a provider's activities (with very large online platforms and very large online search engines being subject to the most extensive obligations), even providers that act as mere conduits or merely cache content are caught by the DSA.

This means that businesses will need to be aware of where in the spectrum of the DSA's categorisations they sit, so that they can ensure they comply with the appropriate requirements. You can see our summary of the varying obligations for different types of service providers here. You can also use Orrick’s DSA Readiness Assessment Tool to see which requirements are likely to apply to your business.

2. The DSA isn’t just concerned with content moderation

Effective content moderation is a major aspect of the DSA and will likely be the element that most platform providers will focus on. However, the scope of the DSA's impact goes beyond takedown processes and extends to the design and operation of online services, minimum standards for online safety systems and processes, and transparency and reporting requirements. It also sets out obligations relating to online marketplaces and the identification of, and control over, online advertising. Each of these requirements is made more extensive for any service which is directed primarily at children under the age of 18.

3. The DSA is part of a broader legislative framework

The obligations of the DSA dovetail with or codify a variety of existing compliance requirements that are imposed on service providers operating in the EU, including consumer and privacy laws. Many of the DSA's requirements will therefore require a review and update to existing policies and terms rather than creating entirely new processes in isolation.

For example, existing consumer terms may need review and updating to clarify any restrictions on the use of the service, the provider's approach to content moderation and information regarding algorithmic decision-making. Similarly, privacy documentation, including policies and risk assessments, will need to take into account the impact of content moderation, which may involve the processing of special categories of personal data and require the retention of records for reporting and appeal purposes.

For services that engage in targeted advertising, the age range of their users will gain additional significance, as the DSA's prohibition on targeted advertising to individuals under 18 extends even to users who can give valid consent under the GDPR to the processing of their personal data for such purposes.

For companies based outside the EU, existing GDPR representatives may be able to perform the role of the representative required to be appointed under the DSA. If that service is unavailable, non-EU businesses may look for alternatives that can consolidate their representation requirements.

4. Penalties for non-compliance are severe but can extend beyond fines

The DSA empowers each EU Member State to appoint a Digital Services Coordinator with oversight of compliance within that Member State. Digital Services Coordinators will be able to take enforcement action against non-compliant businesses, including imposing fines of up to 6% of the business's global annual turnover.

While enforcement against very large online platforms and very large online search engines is led by the European Commission under specified procedures, it remains to be seen what stance each Digital Services Coordinator will take on enforcement against other businesses.

While the prospective fines are significant, the reputational risk associated with non-compliance has the potential to be even more severe. Particularly for consumer-facing online platforms that rely on network effects to generate sufficient user numbers to drive revenue, a regulatory decision that effectively labels a platform "unsafe" could have a business-critical impact.

5. The DSA is part of a broader wave of online safety legislation

Online safety is becoming an increasing regulatory focus area around the world, and the DSA is part of a growing trend of governments legislating on online safety.

Legislation designed to enhance online safety has already taken effect in Australia and Singapore, while the UK's Online Safety Act will come into force in phases over the next two years and will apply to a wide range of services. Although the Online Safety Act bears some similarity to the DSA, there are key differences in approach. See Orrick’s recent comparison of the DSA and the Online Safety Act here for more information.

As the U.S. is a key market for most online businesses, attention will no doubt now turn there, where a raft of federal and state measures are at different stages of the legislative process and where online safety is a useful bandwagon for representatives seeking re-election later this year. We will be posting updates about the various state initiatives; for now, our summary of the current legislation under consideration in Congress can be accessed here and our update on the REPORT Act is available here.

Complying with the DSA is therefore not just a box-ticking exercise for a global business's European-facing operations but can form the basis for (or at least tie into) a broader online safety compliance program that will be relevant for most (if not all) key markets. Although specifics may differ, the core themes and mitigation measures are likely to be broadly transferable. For instance, while there will inevitably be differing approaches to drawing the line between protecting vulnerable internet users from harmful content and protecting freedom of expression, it is a truth universally acknowledged that child sexual abuse material should not be propagated through online platforms. Identifying and focusing on the core themes of online safety legislation and building out good content moderation practices will likely get most companies to a reasonable risk-based position.

If you have questions about this development, reach out to the authors (Kelly Hagedorn and Alex Sobolev) or other members of the Orrick team.