Take 5 for Privacy Podcast – A Conversation with Jules Polonetsky, CEO of the Future of Privacy Forum


October 29, 2020

In our second episode, Heather Egan Sussman talks with Jules Polonetsky, CEO of the Future of Privacy Forum, about the benefits of a comprehensive U.S. privacy law. They also discuss the need to address the trust deficit around the use of data while minimizing risk, the tension between advertising and marketing and other productive uses of data, and why we need to get comfortable with de-identifying data without removing its utility.

 

About our guest:

Jules Polonetsky serves as CEO of the Future of Privacy Forum, a non-profit organization that serves as a catalyst for privacy leadership and scholarship, advancing principled data practices in support of emerging technologies. Jules' previous roles have included serving as Chief Privacy Officer at both AOL and DoubleClick. A former Consumer Affairs Commissioner for New York City, he was also an elected New York State Legislator, a congressional staffer, and an attorney in private practice. Jules has served on the boards of several privacy and consumer protection organizations, including TRUSTe, the International Association of Privacy Professionals, and the Network Advertising Initiative. From 2011 to 2012, Jules served on the Department of Homeland Security Data Privacy and Integrity Advisory Committee. Jules is a member of The George Washington University Law School Privacy and Security Advisory Council.

    Heather Sussman:

    Hello and welcome to “Take 5 for Privacy,” a podcast where we interview notable practitioners in privacy, asking the same five questions to start a conversation about timely and hot topics in privacy today.

    My name is Heather Egan Sussman and I’m a partner in the Cyber, Privacy and Data Innovation practice at Orrick, Herrington and Sutcliffe.  Joining me today is a long-time friend, Jules Polonetsky, who is the CEO of the Future of Privacy Forum, a Washington, D.C.-based, non-profit organization that serves as a catalyst for privacy leadership and scholarship, advancing principled data practices in support of emerging technologies.

    The FPF is supported by the chief privacy officers of more than 150 leading companies and several foundations, including the U.S. National Science Foundation, as well as by an advisory board comprised of the country’s leading academics and advocates, including our law firm, Orrick.

    FPF’s current projects focus on AI, ad tech, ethics, data-driven research, health, smart cities, connected cars and student privacy.

    Welcome, Jules.  We are so happy to have you.

    Jules Polonetsky:

    Thanks so much.  It’s been great to have some of your sage counsel over the years.  Thank you for all the work you do in programs like this for the broader community.

    Heather:

    The pleasure is all mine and all ours.  So, Jules, why don’t we kick it off?

    In your view, what is the most critical issue in privacy today?

    Jules:

    I think we are incredibly challenged right now.  We’re at a time where data is more important than ever for research, for healthcare, for understanding where there are disparities in how people are being treated, finding discrimination, innovations, machine learning.  There’s not an area where data and tech aren’t being used to move forward.  But we have a huge trust deficit.  We want our sensitive data used in ways that will help us.  We want to know what will work for improving healthcare.  We need information, but we’re so skeptical of the government, of big companies, of institutions.  I think our challenge is what trusted structures we can put in place—laws, institutions, technologies—so that we can get the benefits that we know are feasible if responsible rules are in place.  So, the biggest challenge is addressing the trust deficit.  And I think it’s on folks like us to work with others to find those structures, so we can get the benefits and deter, or minimize, the very real concerns.

    Heather:

    Yeah.  I hear you.  Trust deficit and creating trusted institutions.  So, how do we address that issue?  What are the practical ways we can tackle it, in your view?

    Jules:

    Well, you start with law.  We’re not in a position anymore where goodwill or self-regulation will do.  Those have roles, but we can’t simply say, “Trust us, or sue us if something goes wrong.”  We need a comprehensive privacy law in the U.S., not just because it’s getting complicated with the states, and not just because we want to show that there is a U.S. model that is not exactly GDPR but our own way of doing it.  We need it because it will set a baseline of trusted understanding: this is what you can do and this is what you can’t do.  It will set a fair playing field, so certain kinds of companies don’t push the envelope while others are careful, lose out, and are sort of forced to compete by going in the same direction.  So, we start with getting a comprehensive law in place, number one.  Number two, we need to fix the long-time tensions around advertising and marketing.  Because policymakers are worried about ad tech, we’re losing the ability to do basic analytics.  Because of ad tech, we have CCPA.  Because, oh my god, you know, what’s happening with ad tech doing this or that with data?  Clearly, there’s an important role for advertising and marketing.  Let’s put the right rules around it.  Let those people then go compete with each other.  Advertisers are going to advertise, are going to spend money.  Let’s settle the non-stop debate and the concerns about cookies and tracking and this and that, so that they can move on.  And then all the other uses of data that we want for a whole range of helpful and useful purposes can go ahead and move forward.  So, we need law.  We need to kind of bite the bullet and figure out what the right rules are for ad tech, to sort of settle the non-stop debates around that.  And we need to get comfortable with how to actually de-identify data in ways that don’t remove the utility of the data.  Right?  Increasingly, the challenges to de-identification are significant, but, increasingly, there are new technologies and new techniques, but still a lot of confusion.  Who knows exactly what de-identification takes you out of CCPA?  Nobody knows.  The smartest lawyers in the land:  “Well, I don’t know exactly what the AG would think or Mr. Mactaggart would think or what the critics would think,” right?  So, we need to get the technical, legal and policy folks together so that we can hammer out: here’s when the risk is actually low and you don’t need to worry about all of the restrictions.  Or maybe there are some restrictions, but others can be more flexible because you’ve got real, technical safeguards in place.  So, I would start with de-identification, getting the consensus around that, because that would open up, again, many areas where we can get value out of data and guard against some of the concerns and harm.

    Heather:

    Jules, tell us what is your privacy pet peeve.

    Jules:

    One of my privacy pet peeves is privacy notices that start with, “We protect your privacy.  We respect your privacy.”  You know, I don’t know if my twenty years have made me cynical, but I’ve also had the chance to listen in on sessions with consumers where they were asked to read policies and then say what they think.  As soon as the policy starts with “We respect your privacy,” you stop them.  You say, “How does that make you feel?”  And they’re like, “And now they’re going to tell me how they sell my data.”  I think we don’t give people credit.  They understand that this is where you’re telling them not about privacy—because if you were about privacy, maybe you’d be a security company or a privacy tool—you’re not.  You’re using their data for legitimate purposes, and you’re going to have data protection around it and you’re going to have the right legal basis.  You know, the Europeans have it right, right?  Your privacy policy is actually your data protection notice, because you have a legitimate business, you have reasons to use data, and you’re going to tell them what your basis is and why; maybe there’s a law or there’s a contract or so forth.  And here are the purposes, and here is what they can object to and what they can’t object to.  That’s a lot more straightforward.  So, take a lesson, perhaps, those of you who are U.S. practitioners.  Stop promising privacy in your privacy notice.  Explain what you do with the data.  Now, look, I’m not going to complain about long and detailed policies, because lawyers know these are complicated documents.  But what we’re doing is failing to communicate to the consumer in some holistic, high-level way, while also failing the geeks who are reading those policies, like me, who really want to understand, “Hey, how is this company doing this versus that company?”  I’m often reading a policy because I’m trying to baseline for companies what a best practice is by looking at what everyone says.  And if I can’t figure out what you do, with twenty years of reading policies and working in data, reading the policy and looking for answers to the specific questions about sharing and different kinds of uses, and not being confused and assuming everything is for sale—if I can’t figure out what’s going on—then you’re not meeting the needs of the critic, the advocate, the detailed regulator, and you’re not meeting the consumer’s need.  So I love the more holistic consumer statement that can be crafted to stay legally accurate but high-level.  It can be done.  And then give those FAQs with the detail that the critic, that the expert, wants.  Right now, we are lumping those together and nobody is being served whatsoever.

    Heather:

    So, coming up with better ways to present your data practices through straight talk makes perfect sense and is a good offshoot of that concern.

    Jules:

    I like to call it “featurizing” the way you use data, right?  These are core functionalities.  “I will send you the product.”  “I will give you access to the history of what you purchased.”  Right?  These are core capabilities that are needed, sometimes to comply with the law, and you might not otherwise do it.  But sometimes they’re actually useful features.  And, to the extent that we’re able to talk about these things as “features,” people understand that.  “Oh, where are the settings?  What can I do?  Can I turn this off?  Can I turn that off?”  So, to the extent that we “featurize” data use, people understand.  “I toggle.  I ask.  I tweak.”  We get it.  That’s how the Internet works. 

    Heather:

    Jules, it makes a lot of sense.  Featurizing a notice also addresses that trust deficit you were talking about and helps build trusted institutions.  It’s that straight talk.  It’s about being more transparent and delivering information in ways that are easily digestible but still legally complete and sound. 

    Jules:

    Well, since we didn’t do it for ourselves, Apple is going to do it for us, right?  People who are making iOS 14 apps—or listeners of your show who are perhaps helping to write privacy policies—are going to be filling out a special survey, and then Apple will be displaying the answers to users in the format it has set forth.  So, it’s being done for us.  We might as well consider ways we can adopt this notion more broadly, and of course it’s hard, right?  It’s not easy to decide what will fit in each box, but that’s why we get the big bucks, right?  It’s our job to figure it out.

    Heather:

    Well, that’s exactly the kind of innovative solution we’re looking for from FPF.  And you have that broad reach into many different perspectives and companies, so I very much appreciate that perspective.

    Can you share a challenge that you’ve had to overcome in privacy and the lessons that you’ve learned from that experience?

    Jules:

    I’ve learned that we all know our own discipline, our own sector, and we don’t know the sectors that are adjacent to us.  Yet, at the end of the day, a phone, a computer, a service—these things are all linked and tied together.  And even in privacy, there are people in philosophy who are working on privacy.  There are people in ethics who are dealing with privacy.  The academic disciplines don’t talk to each other.  Industry doesn’t often know how to engage with academia, and civil society is coming from a whole other place, with different cultures.  So, this is one of the most cross-disciplinary, cross-cultural fields, cutting across different kinds of people, and most of us aren’t super good at that.  We’re lawyers.  We’re technologists.  We’re not sociologists.  And we’re not these cross-disciplinary experts.  Maybe lawyers end up being good at this because, you know, you show up in court or you get a new client.  You learn what you need to know to become a good enough expert to counsel, to litigate, to learn, you know, the particular technology.  But most of us weren’t trained in a way that teaches us how to influence practice, how to take information from stakeholders.  You know, if anyone thinks that this is only about privacy nowadays, and not about power and civil rights and human rights: you can do everything right.  Dot all the i’s.  Cross all the t’s.  And be clobbered, because a particular group feels that you’re using your power, as a company, as government, as somebody in authority, as a leader of a school, in ways that will harm them because of data you have, information you have—a teacher will be fired unfairly because of an assessment.  An AI will, you know, keep kids in England out of the colleges that they expected to go into because of how it grades them, right?  So, engage stakeholders who have a voice, because if they’re not bought in on how you’re using data, even in a way that is legitimate and lawful and so forth, you’re going to have that backlash.  And that’s not a set of skills we’re taught—that’s something that I really had to learn through lots of pain and suffering and education.

    Heather:

    That’s so interesting.  I was a sociology major as an undergrad, and I never made that connection.  I actually ended up being a sociology major because I kept signing up for the classes that looked so fascinating to me, and I really enjoyed taking them.  I ended up becoming a privacy lawyer because I studied, pursued, and followed the things that I really enjoyed or found fascinating or interesting, for many of the reasons that you talked about—data as power, civil rights, human rights, discrimination, all of it—but also new and evolving technology and innovation.  So, that’s my fun fact about me.

    Can you tell us a fun fact about you that people might not know?

    Jules:

    I’ve run for office at the—and worked—at the city, state and federal levels of government, which I think is a bit unusual.  I was an elected state legislator, a city consumer affairs commissioner and congressional staffer, and I’m a political junkie involved in local, state and now presidential campaigns, so that’s a fun fact.  When I’m not doing privacy, I’m doing politics.

    Heather:

    Well, thank you for your public service, for all the terrific work that you’re doing on behalf of the Future of Privacy Forum.  It’s just a wonderful organization and your contributions to this profession and to advancing important policy discussions and supporting innovation and principled data practices in the United States and beyond have been just really critical to the development of, you know, the future of our profession.

    So, thank you, Jules.  Our five questions are complete, and I can’t wait to have you as a guest again.  But thank you for joining us today. 

    Jules:

    It was great to be with you.  I look forward to catching up in person.  Thanks so much for what you’ve done for the community and all your efforts and insights along the way.