
Customer Data Can Be Used As Evidence in Government Surveillance


The lines between commercial data collection, data privacy and electronic surveillance are becoming increasingly blurred, posing a significant ethical quandary for companies. 

The United States has electronic surveillance laws that prevent authorities from conducting wiretaps and other electronic surveillance without court oversight, but those laws do not necessarily protect our personal data collected commercially. 

That has profound implications in our post-Dobbs world, says Anne Toomey McKenna, a visiting professor of law at the University of Richmond School of Law and an expert on privacy, surveillance and the law.

Most people are aware that U.S. law provides protections against unlawful electronic surveillance and understand the government generally needs to get a warrant to intercept our communications or track us. 

Specifically, the U.S. Constitution’s Fourth Amendment prohibits unreasonable search and seizure; federal electronic surveillance law prohibits unlawful wiretapping—part of the Electronic Communications Privacy Act or ECPA; and U.S. Supreme Court decisions require law enforcement to get a warrant to search our phones or track us using our location data. 

We Walk Around Shedding Personal Data

Those federal and state law protections from unwarranted electronic surveillance largely evaporate in the face of personal data collected by commercial entities — and that is a lot of data. 

We’re surrounded by smart devices embedded with sophisticated sensors and GPS tracking chips and loaded with data-hungry apps that we use for communicating, social media and other online activities, workouts, getting from place to place, and much more. Our homes are equipped with personal digital assistants, digital doorbells, smart TVs, and other smart devices that record what we and everyone in or near our homes are doing and saying. 

Collectively, these devices generate and collect massive amounts of intimate personal data, including detailed information about our health, activities, preferences, and our location, i.e., what we like or dislike, what we believe, where we go, what we do for work and play, and with whom we do these things. 

Each time we click “I agree” — when we buy a smart device, download all those convenient apps, visit a site, use a search engine, contract with an internet or cell service provider, and so on — we give our consent to the collection of the data our smart devices sense and generate about us. The data collected often has little to do with the service an app or website provides. 

This is possible, in part, because the U.S. lacks any overarching federal data privacy law, although some states have passed data privacy laws. 

The Gray Area Between Data Privacy and Electronic Surveillance Law

When we click “I agree,” our contractual consent essentially waives the legal protections that federal laws like ECPA otherwise afford our private communications and location data. We often do so with little thought, focused on whatever convenience we seek from an app or device. But that click creates a legal gray area, eroding the privacy that electronic surveillance law was designed to protect.

While the Federal Trade Commission regulates companies through consumer protection law, the absence of a federal data privacy law means that companies with clear terms and valid user consent are largely unfettered in their data practices. It also means companies have neither federal restrictions on, nor federal guidance about, what data may be collected and how it may be used. 

What happens when law enforcement wants that data? This is the gray area between data privacy and electronic surveillance law. If a data broker is selling lawfully collected or publicly available aggregated data, what is to stop law enforcement from buying it like anyone else? Not much. That data is a treasure trove from both a financial and an evidentiary standpoint. 

Personal Data About Personal Decisions

In June 2022, the Supreme Court decided the Dobbs case, overturning Roe v. Wade and eliminating the constitutional right to abortion. In some states, all abortions became illegal immediately; other states enacted laws banning most abortions after six weeks. Some states, like Louisiana, are trying to go further, proposing laws that would criminalize certain forms of birth control, such as IUDs. How a state legislates abortion and other reproductive rights matters: When state law criminalizes abortion as murder or a felony, it becomes a crime for which investigators can obtain a search warrant under federal and state electronic surveillance laws. 

And that means data in the hands of private entities that reveals certain personal decisions is now valuable to law enforcement. In other words, Dobbs turns a whole category of data about personal activities into potential evidence of a crime. 

The Dilemma for Companies

Companies now face a complex and urgent data privacy situation. A company’s data practices typically run through automated processes: AI and machine learning algorithms underpin the app or website. Company executives might not even be aware of how much and what types of data their company’s systems are collecting, using and selling. But the data is on the company’s systems.

This places companies in an ethical quandary. 


While data may be collected with consent, is that consent truly meaningful? Does the user or customer understand what data is collected and how it can be used, i.e., what location data reveals, what patterns of life it exposes, and what the myriad device sensors can sense and reveal about health status? And what does that data mean from an electronic surveillance and criminal prosecution standpoint? Is it evidence of reproductive choices that were recently criminalized?

In short, company executives need to know exactly what data their company is collecting, what data is stored, how long it is retained, and whether the data being collected is necessary to the goods or services the company provides. That data may mean someone’s liberty, reproductive choices, or decisional privacy is compromised. It could mean a customer goes to jail because of a company’s data practices.
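One way to act on that inventory is to make retention rules explicit and machine-enforceable rather than leaving them implicit in data pipelines. Below is a minimal Python sketch of that idea; the data categories, retention windows, and record schema are hypothetical illustrations, not any particular company’s practice or any requirement of existing law.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

# Hypothetical retention policy: each data category gets an explicit
# retention window and a flag for whether the category is actually
# necessary to deliver the product. All names here are illustrative.
@dataclass(frozen=True)
class RetentionRule:
    category: str            # e.g., "precise_location"
    retention: timedelta     # how long records in this category may be kept
    service_necessary: bool  # is this data needed for the service itself?

POLICY = [
    RetentionRule("account_profile",  timedelta(days=3650), True),
    RetentionRule("precise_location", timedelta(days=30),   False),
    RetentionRule("health_sensor",    timedelta(days=7),    False),
]

def records_to_purge(records, policy, now=None):
    """Yield IDs of records whose retention window has expired.

    `records` is an iterable of (category, created_at, record_id) tuples
    with timezone-aware UTC timestamps; the schema is an assumption made
    for this sketch.
    """
    now = now or datetime.now(timezone.utc)
    windows = {rule.category: rule.retention for rule in policy}
    for category, created_at, record_id in records:
        window = windows.get(category)
        # Data arriving with no rule at all is exactly the kind of
        # blind spot executives need surfaced.
        if window is None or now - created_at > window:
            yield record_id
```

An audit built on a policy like this answers the executive’s questions directly: every category the systems collect either appears in the policy with a justification and a deadline, or it stands out as unaccounted for.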

Shortly after Dobbs was decided, Google announced that it would auto-delete location data for users who visit reproductive health centers. Google immediately understood the risk to Google users: If Google collects and retains this data, it is potentially evidence and law enforcement can collect it. 
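Google has not disclosed how its feature identifies such visits, but the underlying mechanic, discarding location points that fall within a radius of a sensitive place before they are retained, can be sketched briefly. In the Python below, the coordinates, radius, and matching rule are assumptions made for illustration only.

```python
import math

# Illustrative sensitive places as (latitude, longitude, radius_meters).
# These values are placeholders, not real clinic locations.
SENSITIVE_PLACES = [
    (38.9072, -77.0369, 150.0),
]

def _meters_between(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters via the haversine formula."""
    earth_radius = 6_371_000.0
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    d_phi = math.radians(lat2 - lat1)
    d_lam = math.radians(lon2 - lon1)
    a = (math.sin(d_phi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(d_lam / 2) ** 2)
    return 2 * earth_radius * math.asin(math.sqrt(a))

def scrub_history(points):
    """Drop (lat, lon, timestamp) points inside any sensitive radius.

    Data deleted at the source leaves nothing for a later warrant or
    subpoena to reach.
    """
    return [
        (lat, lon, ts)
        for lat, lon, ts in points
        if not any(
            _meters_between(lat, lon, p_lat, p_lon) <= radius
            for p_lat, p_lon, radius in SENSITIVE_PLACES
        )
    ]
```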

If law enforcement issued a simple request for the data, Google could refuse to comply without significant consequences. But if law enforcement served Google with a search warrant based upon probable cause to believe that a person had obtained or performed a now-criminalized felony abortion, refusing to comply would mean contempt of court. That is a much graver risk, particularly for companies smaller than Google.

Google recognized that, even if its data practices comply with existing data privacy laws, collecting and retaining certain data creates a trove of evidence that puts some users at risk of prosecution for previously lawful and private reproductive decisions.

The Need for a Federal Privacy Law and Changes in Data Practices

There are both legislative and technological solutions. 

A bill currently in Congress, the American Data Privacy and Protection Act, or ADPPA, would, if passed, give the U.S. its first meaningful federal data privacy law. But its chances of passage look slim. While the bill is still being debated, it has not been called to a vote. The speaker of the House, Nancy Pelosi, is concerned the ADPPA would weaken California’s data privacy protections. Every day that passes diminishes its chances.

Technology also makes it feasible for companies to make careful, detailed decisions about the data that they collect, maintain and delete. ADPPA addresses some of that. Google has shown that it is possible to make a very targeted decision regarding data handling. But, at the moment, the burden is falling on companies.

Anne Toomey McKenna

Visiting Professor of Law at the University of Richmond School of Law

Professor McKenna teaches Civil Procedure, Evidence, Information Privacy Law, and Cyberlaw in Practice at the University of Richmond School of Law. She is also affiliate faculty at the Penn State Institute for Computational and Data Sciences and a trial attorney with two decades of civil litigation experience in federal and state courts (MD and DC).
