2024 IAPP Global Privacy Summit: Lessons from Enforcers, the AI Landscape and the Future of U.S. Privacy Law


5-minute read | April 17, 2024

Over 5,000 privacy professionals from around the world gathered in Washington, D.C. this month for the International Association of Privacy Professionals’ Global Privacy Summit 2024.

The conference focused on debating the optimal way of regulating data protection and artificial intelligence, analyzing the flurry of new privacy and AI laws and regulations, predicting the direction of protections for children and other vulnerable groups, operationalizing global data protection and AI practices, and considering the litigation risks posed by new technologies and privacy statutes.

We were there to take it all in and can now offer these key areas of emphasis and takeaways:

1. Tensions over proper regulation will lead to choppy waters.

Several sessions and keynote speakers highlighted the ongoing tension over the optimal way to regulate data protection and artificial intelligence.
While the European Union continues to be the jurisdiction of focus for “doing it right” when it comes to broad regulation, there are concerns that its heavy-handed approach may stifle innovation and lead to regulations that are difficult to enforce. In comparison, the market-driven focus in the United States has led to large technology companies wielding significant power in shaping data protection and AI policy, prompting a renewed focus on the overlap with competition regulation.

Speakers predicted: 

  • The EU will continue to be the dominant player in shaping data protection and AI regulation, with trickle-down effects on laws in other jurisdictions. 
  • Companies will keep an eye on the impact of regulation coming out of Asia as some governments there make a renewed push to gain a foothold in global regulation. 
  • The influence of large tech companies may be dampened in the coming years due to increased enforcement of existing regulation and comprehensive investigations into competition concerns. 

As a result, we recommend companies prepare for a rapidly evolving landscape in the data protection and AI sectors over the next year and beyond.

2. Companies should consider leveraging existing organizational frameworks for AI governance.

Unsurprisingly, many sessions this year focused on evolving AI governance requirements and needs abroad and in the United States. Speakers put forth a consistent message: companies should leverage existing organizational and risk management frameworks to develop internal AI governance. For example, several speakers emphasized that existing data privacy and regulated industry model risk management frameworks check many of the boxes for risk-based governance imposed by emerging AI laws. As a result, companies should consider the extent to which they can borrow from existing and familiar governance frameworks to build and implement AI governance programs that integrate with other organizational frameworks. Nonetheless, unique characteristics of AI regulation will ultimately require companies to build bespoke components into any solution they implement.

3. Consensus emerged on the need for enhanced protection of children.

Sessions dedicated to children’s privacy underscored the critical importance of comprehensive regulations to safeguard the digital well-being of minors. Panelists delved into the intricacies of emerging state laws, such as the California Age Appropriate Design Code Act (building in privacy and safety by default), social media age verification laws (putting parents in charge of children’s online use) and amendments to existing privacy laws tailored to address the unique vulnerabilities of children and teens online. 

Speakers and panelists underscored the collective responsibility of stakeholders to prioritize protecting children’s privacy rights while the law settles around enhanced obligations for an expanding age range of minors. In preparation for new online safety laws, panelists suggested companies consider aligning data practices with the “best interests of the child” to get in front of the ever-evolving children’s privacy landscape. Given the current state of the law, we highly recommend working with legal counsel to assess your risk and the optimal strategy for addressing the new standards for protecting children online.

4. Cooperative engagement with regulators is wise.

As in previous years, discussions with regulators and enforcement bodies were front and center. Regulators on multiple panels indicated they are increasing their enforcement staff and preparing to intensify enforcement. However, several regulators acknowledged that a lack of resources requires them to emphasize enforcement in higher-risk and greater-impact areas, in the hope of trickle-down effects.

Due to resource constraints, many regulators acknowledged that investigations may not materialize into full enforcement actions, particularly when companies demonstrate improvements to identified deficiencies and cooperate with regulators in addressing their inquiries. As a result, we highly recommend working closely with legal counsel in the event you receive a request from a regulator in relation to your compliance with applicable law, as a carefully crafted response may satisfy the regulator’s agenda and avoid an enforcement action. 

5. Litigation risk continues to expand.

Multiple sessions made clear that litigation risk for companies continues to grow in both volume and value as technologies develop and as states and regulators enact new laws and regulations to protect the public, and the public’s personal data, from the potential risks associated with new technology. Indeed, over the past five years, the number of federal data privacy lawsuits filed each year has more than doubled, and verdicts and settlements have become increasingly costly.

While some of this increase is due to the creative application of old laws (such as laws prohibiting wiretapping) to new technology, much of it is due to the continued proliferation of state privacy statutes that provide for statutory damages, like the Illinois Biometric Information Privacy Act and the California Consumer Privacy Act’s data breach private right of action. Because these statutes provide for statutory damages, companies cannot seek dismissal on the ground that plaintiffs failed to show actual harm from an alleged privacy violation, a key defense against non-statutory claims.

As with any new statute, the law interpreting and applying these statutes is in flux and rapidly changing. If claims under these or similar statutes are filed against you, promptly seek advice from legal counsel on the state of the law and your potential defenses or liability.

If you have questions, reach out to our authors (Emily Tabatabai, Shannon Yavorsky, Nicholas Farnsworth, Matthew LaBrie, James Chou, and Jared Mark) or other members of the Orrick team.