Should the U.S. Follow Europe in Imposing Stricter Data-Privacy Regulations?
June 19, 2018
The European Union has taken a huge step in regulating the internet with its General Data Protection Regulation. Should the U.S. follow suit?
The GDPR gives individuals significant new privacy rights and adds more responsibilities for companies. Those that don’t follow the rules—or that don’t show they are responsive to people’s privacy demands—face steep fines of up to 4% of their worldwide annual revenue or €20 million (about $23 million), whichever is higher. The law, which took effect on May 25, also applies to companies that don’t have a physical presence within the EU, so long as their business targets users who are in the EU.

Though there are exceptions, in many cases companies must obtain affirmative consent to use European residents’ personal information, and that consent can’t be made a condition of using a service. People covered by the law can ask to see all of the information a company has about them, and they can ask that the information be corrected or deleted. They can also ask that companies not use automated processes when making big decisions about them, such as financial or legal matters.

Those aren’t the only responsibilities companies face. They must limit their data collection, taking only what’s necessary for the job at hand, and must delete information about people as soon as it is no longer needed.
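The fine structure described above is a simple "whichever is higher" rule. As a rough illustration (not legal guidance, and assuming revenue is already expressed in euros), the maximum penalty for the most serious violations works out like this:

```python
def gdpr_max_fine(annual_revenue_eur: float) -> float:
    """Maximum GDPR fine for the most serious violations:
    the higher of 4% of worldwide annual revenue or a flat 20 million euros."""
    return max(0.04 * annual_revenue_eur, 20_000_000)

# A company with 1 billion euros in revenue faces up to 40 million euros.
print(gdpr_max_fine(1_000_000_000))  # 40000000.0
# A firm with 10 million euros in revenue still faces the 20 million floor.
print(gdpr_max_fine(10_000_000))     # 20000000.0
```

The flat floor means the cap never drops below €20 million, no matter how small the company; for the largest firms, the 4% term dominates.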
Businesses also have to give individuals clear and understandable explanations of what they do with their data and why. What’s more, in broader terms, companies need to conduct impact assessments to see how their new products or services will affect privacy.

Jonathan Taplin, the director emeritus of the Annenberg Innovation Lab at the University of Southern California, says the U.S. should follow Europe’s lead to counter tech companies’ widespread collection of personal data. Julian Sanchez, a senior fellow at the Cato Institute and contributing editor for Reason magazine, argues against it, saying generic regulations aren’t the most effective way to give users meaningful privacy choices.
Yes: We must rein in big tech and get our privacy back.
By Jonathan Taplin
At a Google conference in May, the company unveiled its Duplex digital assistant. In an onstage demo, it dialed a hair salon and negotiated to book an appointment, mimicking the “ums” and “hmms” of human speech. The receptionist had no idea she was talking to a robot. In the auditorium, the assembled coders applauded.
If ever there were a metaphor for how clueless Big Tech is about the notion of privacy, and technology’s growing role in our artificial-intelligence-mediated world, this was it. That the crowd of programmers was pleased by the idea that an AI assistant could fool a human should give us pause. Because we are being fooled—into handing over details of our lives for the benefit of tech giants.

To date, the biggest tech companies—Google, Facebook and Amazon—have collected personal data on more than 2.5 billion people around the globe. That may include your religious and political affiliation, sexual preference, shopping history, every location you have visited online and offline, and your favorite movies, music and TV shows. This data is then used to send you targeted advertising at the exact moment that your posts tell the platform providers you are most susceptible to a commercial message. And they do it with a free hand. What other business sector operates under such a wide liability shield? What other sector can ignore copyright protections and post music, TV and journalism without paying for it?
These developments were proceeding at light speed until the European Union started taking aim at Silicon Valley. The EU’s General Data Protection Regulation is the biggest step yet toward undoing the 20-year regime that has benefited Big Tech. I believe the U.S. should follow the EU model and impose our own version of the GDPR.

Critics of the GDPR say the new rules will only aid Big Tech monopolies that can afford compliance costs. But as anyone who has been paying attention knows, sites both big and small have been sending out notices of changes to their privacy policies to conform to the GDPR. The cost of posting a notice on your site is nominal. If you are not a data collector, you have no worries or compliance costs.

Those critics also suggest that giving consumers a choice of privacy options is self-defeating—that it involves so many “consent/not consent” boxes to click that consumers become inured to privacy concerns. But people who care about privacy will take the time to opt out. Those who don’t will opt in. And not all sites will make the process complex. Unlike Facebook and Google, which present consumers with a maze to navigate, Twitter has a single page with two boxes to click. And the GDPR does more to protect privacy than just demand consent in some cases.
Then there’s the critics’ claim that Facebook’s and Google’s ad sales will benefit from the GDPR at the expense of other companies. Those critics don’t really understand the ad market. Last year, Google and Facebook took 87% of the incremental growth in U.S. digital advertising, estimates Jason Kint, CEO of Digital Content Next. That dominance was fueled by their control of consumer data on billions of users. The GDPR’s rule that user consent must be “freely given” could reduce that strategic advantage, without which the Digital Duopoly might begin to crumble. There are already signs that programmatic advertising is declining. A user who opts out of data collection is much less valuable than one who opts in, because in many cases Google would know only the user’s age and sex. It stands to reason that trusted publishers could then compete for more of those ad dollars.
We must put our own privacy regulations in place to make that happen. Around the same time Google was unveiling its talking assistant, the president’s top technology adviser told representatives of Google, Facebook and Amazon that they would be offered resources and freedom to explore AI development. We the people will continue to fund the Silicon Valley gravy train, while a few billionaires reap most of the benefits. It needs to stop.
Mr. Taplin is the director emeritus of the Annenberg Innovation Lab at the University of Southern California and the author of “Move Fast and Break Things: How Facebook, Google and Amazon Cornered Culture and Undermined Democracy.”
No: The rules impose high costs—with few benefits.
By Julian Sanchez
The European Union’s new consumer-privacy regime has gone into effect, a fact you’re probably at least vaguely aware of thanks to the mountain of “privacy-policy change” notices piling up in your inbox.
The new rules have many American privacy advocates gazing enviously across the pond. Yet there is ample reason to doubt that rules in the GDPR mold will yield meaningful benefits that justify the costs they impose.

The best argument for data-protection regulation has been that the current dominant approach to protecting privacy is a sham: Online platforms have long given users notice of how their data will be used in vague, legalistic and lengthy terms-of-service agreements, which users almost universally “consent” to by clicking “agree” without reading a word. The GDPR “solution” in some ways assumes the problem is that we haven’t given users enough fine print to read or buttons to click.
We’ve already had a preview of how well that approach works: You’ve probably visited a website that, in response to existing EU rules, throws up a banner forcing you to agree to its data policies or click through pages of options before proceeding. And, if you’re like most people, you’ve honed your reflexes to click through these minor annoyances as quickly and automatically as possible. Like antibiotics, such notices may work when used sparingly, but they tend to become ineffective when deployed indiscriminately. To be sure, the GDPR has plenty of other restrictions on how data is used. But when the law demands ritual box-checking even for ubiquitous and, to most of us, unobjectionable uses of data, users are conditioned to speed through the nuisance by simply clicking “agree.”
That doesn’t mean it is impossible to give users more robust and meaningful control over the use of their data, but what’s the most effective way to make privacy choices salient and intelligible? Generic regulations aren’t just ill-suited to solving that problem; they may be counterproductive. As Berkeley professors Kenneth A. Bamberger and Deirdre K. Mulligan report in their book “Privacy on the Ground: Driving Corporate Behavior in the United States and Europe,” regulation focusing on formalistic methods such as long click-through “consent” mechanisms can diminish attention and resources companies give privacy issues and foster a “compliance mentality.”
On the other side of the ledger are the compliance costs of such regimes, which aren’t borne only by behemoths like Facebook and Google.
The GDPR defines “personal information” broadly to include, among other things, routinely logged data like internet-protocol addresses. Countless companies (and nonprofits) that few of us would consider privacy threats are saddled not only with ensuring their data-use notifications satisfy EU standards, but also with developing mechanisms to handle requests to purge or provide data. Those costs are a rounding error for a Google or a Facebook, but less so for smaller companies.

And that is hardly the only way the rules tend to favor the digital economy’s lumbering dinosaurs over its scrappy mammals. We’ve become accustomed to a cornucopia of free online content and services underwritten by advertising—and, increasingly, by targeted advertising fueled by data.
That gives an advantage to the biggest players, which have the most data to mine. The most vocal proponents of privacy regulation are often equally concerned about the disproportionate power of big players, yet introducing more regulatory friction into the process of monetizing data is virtually guaranteed to give those players even more power.

There is one element of the GDPR worth copying: the requirement that data custodians notify users promptly in the event of a breach.
Companies are notoriously averse to publicizing the fact that they’ve been hacked. Users at minimum need basic information about which companies are fulfilling their obligation to safeguard data and which aren’t. With that exception, Americans shouldn’t be too eager to emulate our European cousins’ approach to data protection. Much of the rigmarole around boarding a plane since 2001 is justly derided as security theater—an elaborate performance that has more to do with reassuring travelers than detecting real threats. The GDPR is a similar form of privacy theater.
Mr. Sanchez is a senior fellow at the Cato Institute and contributing editor for Reason magazine.