

We Are All Working in Digital Sweatshops

The digital information ecosystem farms people for their attention, ideas and data in exchange for so-called "free" services. Unlike their analog equivalents, these sweatshops of the connected world extract more than one's labor, and while clocking into the online factory is effortless, it is often impossible to clock out.

I am reminded of this state of affairs by the recent stream of messages from online service providers about changes to their terms and conditions and privacy policies. The messages vary, of course, and some explicitly cite the General Data Protection Regulation (GDPR), which comes into force this week, as the reason for the change. Failure to accept the new terms by May 25, we are told, will mean we can no longer use these services.

A Hint of Menace

For most people outside the esoteric data protection bubble, this represents first contact with the new dispensation of digital rights and obligations in the European Union. If this encounter seems like a take-it-or-leave-it proposition—with perhaps a hint of menace—then it is a travesty of at least the spirit of the new regulation, which aims to restore a sense of trust and control over what happens to our online lives.

The recent scandals have served to expose a broken and unbalanced ecosystem reliant on unscrupulous personal data collection and microtargeting for whatever purposes promise to generate clicks and revenue. In such a distorted environment, everyone must now participate, instilling the paradoxical sense of being more and more monitored, and yet less and less known and respected by the small number of remote tech powers.

The Fundamental Right to Privacy

As the state of things digital becomes gradually clearer, there are already noises suggesting that if you object to being tracked in exchange for the "free" services on which so much of our lives now depends, then the only alternative is to pay. But the fundamental right to privacy and related freedoms such as free speech and non-discrimination apply to all; they cannot be the exclusive privilege of those who can afford to pay.

The positive takeaway from all of this is not simply that data protection has suddenly become trendy. Regard for online privacy is now firmly a part of the PR toolkit of any organization that cares about its customers and reputation. The big risk, however, is a growing gulf between hyperbole and reality, where controllers learn to talk a good game while continuing with the same old harmful habits that the EU legislator has been attempting to stamp out with the GDPR and other ongoing reforms, notably the ePrivacy Regulation.

Will Stricter Data Protection Favor Big Companies?

So for instance, we have seen a spate of articles alleging that stricter data protection will favor big companies. There is no doubt that controllers tracking people with whom they have no relationship are rightly going to have to adjust their behavior. But the broader reality is that accountability and obligations in the GDPR are scalable, with data protection authorities empowered to enforce the law with rigor proportionate to the scale of the violation. Controllers responsible for personal data processing on a massive scale, involving data of the most sensitive nature, face by far the biggest challenge in demonstrating the lawfulness and, indeed, ethical grounds for what they have been doing over the last decade or two.

The GDPR is, essentially, about accountability of controllers and safeguards for individuals, including giving them more control over what happens to their data. Its greater goal is to protect individuals, not companies.  

Too often, privacy policies have seemed to be designed to provide legal cover for the companies themselves in the case of harm to a customer: nonnegotiable, incredibly long, complicated and full of legal jargon that nobody reads (except to expose the unfairness of this practice).  Furthermore, the policies have tended to give an illusion of user control, while in reality, you cannot see or control what the company does with information about you.

Consent Cannot Be Conditional

Companies whose business models depend on tracking are now asking customers to say whether they agree to, for example, the use of sensitive data by outside sources. Just as with the notorious cookie pop-ups, people feel pushed towards clicking "I accept" because the only apparent alternative seems complicated, time-consuming and risks excluding them from digital society.

We and other Data Protection Authorities are therefore worried that even the biggest companies may not yet understand that with the GDPR, these manipulative approaches must change. They must change, for instance, to satisfy Article 7(4) of the GDPR, which states that consent cannot be freely given if the provision of a service is made conditional on processing personal data not necessary for the performance of a contract.

Brilliant lawyers will always be able to fashion ingenious arguments to justify almost any practice, but with personal data processing, we need to move to a different model. The old approach is broken and unsustainable; that will be, in my view, the abiding lesson of the Facebook/Cambridge Analytica case.

Data Protection Authorities are taking action, with a new social media subgroup meeting for the first time in mid-May. We must all be vigilant about attempts to game the system.

Controllers will be concerned about compliance, and the most thoughtful and diligent controllers will aim to turn responsible data processing into a competitive advantage. That will be one of the main messages of the EDPS Opinion on privacy by design, but what individuals and regulators expect is a change of culture.

Consigned to History

Data protection emerged in the 1970s and '80s as a response to the automation of processing operations and new forms of communication. Massive digitization and machine learning are demanding new and smarter policy responses: stronger enforcement, but also empowerment through tools such as meaningful consent, ethics and accountability, and a fairer allocation of the digital dividend.

Let’s hope that, like the dehumanizing working conditions of the sweatshops of the 19th century, the abuses so prevalent in this early phase of digitization are soon consigned to the history books.    

This piece was previously published on the EU’s website.

Giovanni Buttarelli

European Data Protection Supervisor of the European Union

Giovanni Buttarelli was appointed European Data Protection Supervisor on 4 December 2014 for a term of five years by a joint decision of the European Parliament and the Council. Before joining the EDPS, he was secretary general at the Italian Data Protection Authority. As a Cassation judge in the Italian judiciary, he has long been involved in many initiatives and committees on data protection and related issues at international level.
