Interview with Ari Ezra Waldman on privacy, power and civil rights (German version will be published in PinG 1.22)
Ari Ezra Waldman is a professor of law and computer science at Northeastern University in Boston, Massachusetts. He studies how law and technology affect marginalized populations, with a particular focus on privacy, misinformation, and the LGBT community. He has written numerous articles, both in leading US law reviews and in the popular press.
In September 2021, Professor Waldman published his second book “Industry Unbound – The Inside Story of Privacy, Data, and Corporate Power” (Cambridge University Press), in which he describes how, in his view, the tech industry conducts its ongoing crusade to undermine our privacy. With research based on interviews with tech employees and internal documents outlining corporate strategies, he demonstrates how companies do not just lobby against privacy law; they also manipulate how we think about privacy, how their employees approach their work, and how they weaken the law to make data-extractive products the norm. In contrast to those who claim that privacy law is getting stronger, Professor Waldman argues that recent shifts in privacy law are precisely the kinds of changes that corporations want and that even those who think of themselves as privacy advocates often unwittingly facilitate corporate malfeasance.
In a conversation with Niko Härting, Ari explains what “Industry Unbound” is all about and why he thinks it is an illusion to believe that the GDPR can help rein in corporate power and privacy violations. He believes that Big Tech has co-opted the GDPR and used it for its own ends. While he sees an urgent need for grassroots social movements with strong privacy arms, he does not expect privacy advocacy from privacy professionals and privacy lawyers, whom he calls the “privacy professional class”.
Niko Härting: When the GDPR was adopted in 2016, many Europeans hoped that the new rules would give European businesses an advantage over the US internet giants. There was a lot of talk about achieving a “level playing field” between EU and US companies (see, e.g., https://ec.europa.eu/commission/presscorner/detail/en/STATEMENT_16_1403). A few years later: How have the “Big Five” coped with the GDPR? Has the goal of creating a “level playing field” been achieved?
Ari Ezra Waldman: Big Tech has not only co-opted the GDPR; it has adapted it to achieve the goal of unrestricted data collection. The GDPR is a process-based law. People can talk all they want about its supposed origins in European human rights discourse. But the law itself is almost entirely based on process. It requires ongoing compliance via internal organizational structures. Even its individual rights of access, correction, and the like require internal corporate compliance structures to assess requests. And those structures are not built with the public’s interests in mind. Indeed, the opposite is true. As I describe in “Industry Unbound”, privacy impact assessments, data protection officers, record keeping, and all the other procedural requirements of the GDPR are coordinated within corporate organizations that are built for one thing: profit. In that context, the law is co-opted. PIAs become little more than assessments of whether a company is going to get sued or investigated, not whether a product will materially impact the privacy of consumers. Privacy and data protection offices are often mere edifices, even if they have the ear of the executive. Technology companies take advantage of process to legitimate, rather than constrain, already existing extractive data practices. And this is just one of the ways the GDPR has not levelled the playing field for anyone. It has just amplified the power of those who already had a lot of power.
Niko Härting: Do I understand you correctly: You regard data protection officers, record keeping, privacy impact assessments, and other procedural requirements as fig leaves that help Big Tech disregard privacy and make even more profit? I don’t think you will find many privacy practitioners working for the “Big Five” who would agree.
Ari Ezra Waldman: In many cases, technology companies of all stripes are performing accountability in the most cynical way possible. In my four years of research, I found companies bragging about their privacy offices but splitting their budgets among three other departments (IT, Legal, Compliance), so nothing gets done. I saw companies tell researchers that they integrate privacy into the weeds of their business units, but then use reporting hierarchies to prevent their privacy lawyers and privacy professionals from having any impact on design unless a question trickles up the food chain to them. I saw PIA templates on workers’ desks with the instruction “just check here” to indicate that nothing they did posed privacy risks. I saw a company issue press releases about hiring privacy engineers, but then only let those engineers check the code other people wrote at the very end of the process, making it very unlikely anything was going to change. I saw the much-vaunted “audit” turn into a silly exercise in which an assessor hired by the company comes in and fills out a series of questions about compliance with FTC court orders based entirely on the attestations of executives. I saw companies focus entirely on manipulating consent in order to meet one of the GDPR’s lawful bases for data collection and then proceed to ignore other requirements. I saw companies argue in litigation that no one has privacy rights once they disclose any information on their platforms, at the very same time those companies were filling out PIAs, hiring privacy officers, and conducting assessments. I am not saying every single person in every single tech company does this. What I am saying is that the notion that the GDPR’s procedural tools could ever constrain data extraction is a myth, because those procedural tools are integrated into corporate organizational structures designed to co-opt them.
Niko Härting: European privacy experts would argue that what you describe are clear breaches of the procedural requirements, so that the problem is not the requirements but the lack of DPA resources needed to monitor compliance and enforce the GDPR through sanctions and hefty fines.
Ari Ezra Waldman: Then why create a system of public governance that relies on structures you know cannot enforce even the most basic requirements? That argument is an excuse, and it is false. Many privacy professionals and privacy lawyers say that this is simply what privacy law is. And there is nothing in the GDPR that makes any of it unlawful. So, if you are right that privacy experts would call what I describe a breach of the law, they are thinking of the law as they would like it to be, not the way it is in real life, which is what my research is about.
Niko Härting: In the days before the GDPR, most privacy experts in Europe would hold the view that we had strong rules but virtually no enforcement. The GDPR did not substantially alter the rules but very much relied on strengthening enforcement. “Privacy now has teeth.” You are now saying that the rules simply do not work. What kind of rules would you suggest in order to protect what you call the “supposed origins in European human rights discourse”?
Ari Ezra Waldman: Serious scholars I admire have called for putting substantive requirements, like duties of loyalty, on top of the GDPR’s procedural rules. The justification for such proposals is that they impose affirmative limits on data collection and processing, far beyond process. There are, after all, some situations where no amount of process is sufficient to protect the substantive human rights of individuals. I still worry that these affirmative obligations will be co-opted as well. You cannot escape the suffocating power of performative compliance. If privacy is a human right, it cannot depend on the good graces of a corporation, a data protection officer, a chief privacy officer, or a privacy lawyer filling out a form. Besides an obvious, and MASSIVE, reinvestment in public governance unlike anything we have seen in decades, we need to start thinking about privacy and data protection differently. Privacy is labor protections, so researchers who want to publish papers challenging their employers’ bottom line won’t get fired, as happened at Google. Privacy is public governance that does not rely on audits and systems that Julie Cohen called “on the periphery of the regulatory state”. Privacy is criminal penalties for executives. Privacy is environmental protections that stop companies that want to transport data across the globe from destroying delicate ocean ecosystems. But in the end, privacy can only be achieved when the information industry is, essentially, nationalized for the public good. And for those who say that is not going to happen tomorrow: well, what are you doing now so that it can happen later on?
Niko Härting: I totally agree that we have to talk about the purposes of privacy and its links to many other fields, issues, and human rights. Might it be the lack of a clear focus, and of reflection on the purposes of privacy rules, that has led to empty and mindless compliance routines? You do not have to know anything about human rights when your job is drafting GDPR-compliant privacy policies or keeping “records of processing activities”.
Ari Ezra Waldman: I think “privacy is fuzzy” or “privacy is complex” is an excuse, too. “What Privacy Is For” is quite literally the name of a canonical article written by Georgetown law professor Julie Cohen in the Harvard Law Review. “Why Privacy Matters” is quite literally the name of a book just published by Neil Richards of Washington University in St. Louis. Daniel Solove wrote a book more than a decade ago called “Understanding Privacy”. I do not think we have any trouble understanding or discussing or defining privacy, if we really want to. But, no. Tech companies would rather perpetuate the myth that privacy does not mean anything clear enough to put into law, or anything clear enough to integrate into design. That allows them to say “we can’t” rather than the truth, which is “we won’t”.
Niko Härting: I wonder if it is only the tech companies that manipulate privacy debates, or if it is not time, much more generally, to re-politicize privacy. In the early days of data protection, in the 1960s and 1970s, data protection was very political. The discussion was about government surveillance and about privacy as a precondition for free speech, the freedom of assembly, the freedom of association, and other rights. Nowadays the focus is on cookies and checkboxes and on the correct wording of privacy policies. Re-politicizing the debate might also mean accepting that the core problem with Big Tech is not privacy but power. If you want to break the power of giant global companies, enforcing “strict” privacy rules will not do the job; you have to break them up with strong tools from the toolbox that anti-trust law (hopefully) provides.
Ari Ezra Waldman: Yes! This is a fantastic idea. Privacy is indeed about power, and anyone who thinks individualized consent should play a role in privacy law, no matter what they tell you they are doing, does not understand how choice and consent shift power to companies and entrench it there. That said, we need to politicize privacy in the right way. Real change can only come with increasing consciousness of the risks and dangers of surveillance. We need to take as our model André Gorz’s idea of non-reformist reforms. Even reforms that seem positive have to come from an active social movement that not only wrests power from tech companies and returns it to the people, but simultaneously raises the people’s consciousness that today’s informational capitalism, and its neoliberal governance systems, subordinates us.
Niko Härting: Active social movement? I am sceptical. In Europe, we have seen a sharp increase in government surveillance during the Covid crisis with very little resistance. And in spite of so much criticism of the power of Big Tech by intellectuals, most of those intellectuals will be daily users of Big Tech’s apps and tools. When governments show teeth to tech companies (e.g. by raising their taxes), most citizens will applaud. However, I do not see a social movement on the horizon demanding the break-up of one of the GAFAM companies.
Ari Ezra Waldman: That is the point. I do not see one either. We cannot wake up tomorrow with an aggressive, consciousness-raising, ground-up social movement without the infrastructure for that movement in place. There are civil society organizations out there that have too narrowly conceptualized their missions. Some are so pro-corporate that they do not want to change their tunes. Others have been more aggressive but seem only willing to focus on the margins: a new agency, a private right of action (at least in the United States). There is not going to be any social movement built around a new federal agency focused on privacy. Privacy is about power. Every single organization dedicated to advancing the liberation of marginalized populations should have an arm galvanizing its community around privacy and the ways in which Big Tech subordinates them. I am thinking of organizations like Data for Black Lives as a good example. I am queer. Where is the Human Rights Campaign, a major US LGBTQ advocacy group, in the privacy space? Data extraction disproportionately harms those who are the most marginalized, who face repeated and harmful interactions with public and private data collectors, and who have stigmatizing social identities. Why are we not building a queer movement around privacy?
Niko Härting: Actually, there are two queers in this conversation. And the LGBTQ movement might be a good example of how the focus on privacy changes over time. Back in the 1980s, privacy was a big issue in the LGBTQ movement: privacy was under attack during the AIDS crisis, and it was a cornerstone of the fight against discrimination. Nowadays, there is little talk about privacy in an LGBTQ context. Could we learn from that example that privacy should always be put in a political context if we want privacy to be relevant? A context such as the fight against discrimination, against corporate power, or similar issues? You will not find any such context in the GDPR. That is why you can easily be an avid follower of the GDPR without understanding anything about human rights. When privacy and data protection become ends in themselves and not much more than another chapter in corporate compliance books, the misuse of power, government surveillance, and discrimination against minorities can continue undisturbed by busy privacy professionals.
Ari Ezra Waldman: That is a really good point, and one that I wish were more front and center in our understanding of the history of our community. Scholars like Bill Eskridge, reporters like Sarah Schulman, and activists like Peter Staley have written tremendous books about the fight for queer rights in the last century. Professor Eskridge’s work was about the law, including the law of privacy. Schulman and Staley, situated as a reporter on the ACT UP beat and one of the leaders of ACT UP NY, respectively, take a more on-the-ground perspective. Staley’s memoir is just fantastic, though I have to admit I have not finished reading it. But there is more history to be done: We need to understand the history of those years through a privacy lens, too. Maybe that is my next book: privacy advocacy with ACT UP as a model? Ha!
Privacy is particularly important for the queer community, and I do not just mean sexual privacy: our ability to protect our identities in a world of hostile institutional and implicit discrimination, and our right to live free of the criminalization of our sexuality and sexual conduct. Privacy is also about the propriety of using our sexual and gender identities as capitalistic weapons of manipulation. The latter point was, of course, not part of the discourse during the AIDS crisis. But we have the capacity to make it part of the discourse now.
Niko Härting: “ACT UP as a model”, I like that. At least in Europe, we do not see a lot of privacy advocacy these days. Instead, we see an increasing number of privacy lawyers and professionals, of seminars and conferences. At those conferences, we discuss privacy much the way money laundering would be discussed at a money-laundering compliance conference. We discuss how privacy policies should be drafted and how to interpret Art. 25 GDPR (privacy by design/default). We rarely touch on government surveillance or Big Tech power. We occasionally pay lip service to human rights but hardly ever discuss civil liberty issues. Therefore, I am sceptical about privacy advocacy at the moment and would rather say that privacy needs to be more embedded in issues like power and discrimination. What about “anti-trust advocacy with ACT UP as a model”, with privacy being part and parcel of the debate?
Ari Ezra Waldman: When I say privacy advocacy, I absolutely do not mean that it is going to come from privacy professionals and privacy lawyers. The privacy professional class is just that: a professional class of elite workers making a comfortable living. They will be inherently conservative, because the status quo has served them well and the increasing professionalization of the group will make bourgeois values the norm.
ACT UP was not a bourgeois revolution. It was a revolt of the powerless, the people even other marginalized groups looked down upon and shunned: queer people. You are right that privacy advocacy needs to be focused on issues of structural power and discrimination, and the only way that is possible is if it comes from the people: Black people, queer people, women, Latinx people, religious minorities, and anyone who wants to join the cause.