Opinion
Computing Profession Viewpoint

When Does Law Enforcement’s Demand to Read Your Data Become a Demand to Read Your Mind?

On cryptographic backdoors and prosthetic intelligence.

The recent dispute between the FBI and Apple has raised a potent set of questions about companies' right to design strong cryptographic protections for their customers' data. The real stakes in these questions are not just whether the security of our devices should be weakened to facilitate FBI investigations, but ultimately whether law enforcement and intelligence agencies should be able to read our minds and our most intimate private thoughts.

In the U.S. and other countries, there have been many legal cases in recent years pitting the demands of law enforcement against the concerns of technology companies and privacy advocates over access to new, technologically generated information about people. The disputed topics have included spy agencies' bulk collection of Internet traffic and mobile phone metadata; law enforcement's use of location-tracking devices, malware, and fake cellphone towers; and the constitutionality of "gag orders" that make it a crime for individuals and companies to ever discuss certain requests they receive for others' data.

In some sense, this is not a new debate; the Fourth Amendment to the U.S. Constitution, for instance, has engendered a long history of litigation over the boundary between information the police can obtain about people simply by demanding it with letters called subpoenas, and information for which a court-issued warrant is necessary. What has changed are the stakes of these disputes.

As the law has operated in the past, almost any information was theoretically fair game for law enforcement to demand if it had probable cause and obtained a warrant. But there was not nearly as much to collect: people did not carry recording and tracking devices with them everywhere, and they did not turn over the most intimate details of their lives to multinational technology companies. There were also legal limits: the private thoughts of defendants were largely protected by rights to remain silent and against self-incrimination, historical legal protections that sprang up as shields against religious persecution. Unfortunately, changes to our lifestyles, to our relationship with technology, and to the very process of human cognition are making these protections so impractical that they may effectively cease to exist.

So, what do we mean by changes to the process of human cognition?

Pens and paper are wonderful things. "Hang on. Let me write that down," or "I need a pen and paper to work this out," are the kinds of utterances that reveal our dependence. It is intelligence that makes us human, and a pen and paper magnifies our intelligence.

If you doubt this, consider any reasonable method of measuring intelligence. A human with a pen and paper will perform at least as well as, and often much, much better than, the same human without a pen and paper. So it would be reasonable to say that the pen and paper constitute a prosthetic component of our intelligence, or at least a prosthetic aid for our imperfect memory.

Furthermore, reading someone else's notes is often described as a window into their mind. Reading someone else's diary without their permission seems not only a violation of privacy but also a form of taboo mind reading.

Now consider the same human with access to Google, Wikipedia, GPS, a calculator, a mobile phone for communicating with friends and colleagues, and indeed the whole Internet. As long as cat videos are not too much of a distraction, this well-resourced human can answer hard questions and perform many difficult tasks much more quickly than people could even two decades earlier.

As hunters, weapons were prosthetic claws. As gatherers, baskets were prosthetic arms. After the development of agriculture, horses and plows were huge prosthetic muscles. Later the industrial revolution made us physically strong to a level unimaginable beforehand. And looking back, the invention of writing was the first step on the road to a modern existence built on prosthetic intelligence, one where the states we share through the Internet and the financial system are becoming more important than the biological and physical environment around us.


But this has come at a complicated price. You can think faster and more accurately, but your electronic devices know where you are, where you have been, who you have talked to, what you said, what your heart rate was at the time, what you have looked at on the Web, what medication you are taking, what you have bought, what maps you have looked up, and what spelling mistakes you make; and the pace of this data collection is only accelerating. With virtual reality and augmented reality looking imminent, gadgets will begin to log almost every action we take. And we have no choice but to pour our minds out if we want to exist and perform at the same level as the humans around us.

Setting aside arguments about the precise definitions of words, it is clear that many humans in the developed world now have much of their thinking happening, or at least observable, outside their brains, and this is only likely to increase in the future. It is through this lens that we need to understand the importance of Apple's fight to use encryption to protect some (presently very small) portions of its customers' data so that Apple (and, transitively, the FBI) cannot read it. The FBI wants to be able to turn over literally every digital stone in its investigation. But in the era of prosthetic intelligence, that is equivalent to outlawing strong privacy for any corner of the modern human mind.

Where is this heading? Consider a future technological innovation: a brain reader. It is a little device that you attach to your skull that lets someone read your thoughts. This could be a great boon to law enforcement. Trials could be conducted more accurately by reading the thoughts of the defendant. Even better, everyone could be required to attend a daily mind reading to make sure they are not plotting any criminal acts. This would significantly cut down on premeditated crime, making our lives safer. Then we could concentrate on unpremeditated crime. Possibly there are some thoughts that people who are likely to commit unpremeditated crimes tend to think. We could proscribe those thoughts, and then preemptively arrest people for thought crime. While we are at it, the morality police could pass laws against thinking racist, sexist, extremist, sacrilegious, offensive, or fattening thoughts.

While such an extreme society may have a low crime rate, some people (including us) may think this police state would not actually be a better society to live in. Even ignoring the horrors that would result from imperfect readings, who doesn’t feel guilty about something? As attributed to Cardinal Richelieu, "If you give me six lines written by the hand of the most honest of men, I will find something in them which will hang him." Such devices do not exist yet, although the demand has been strong enough that polygraphs, notorious for unreliability, are widely used in the U.S. Other technologies like fMRI are already being used and may turn out to be slightly more accurate than polygraphs, but we are still some distance from having to worry about the societal effects of active mind-reading machines.

What we have instead is a society moving toward prosthetic brains that can be monitored at all times by the state, without the inconvenience of having everyone check in each day at the police station. It may feel less invasive to have your eye movements recorded by your augmented reality glasses when an attractive member of the opposite sex walks past than to pay a daily visit to the mind reader. The former is certainly more convenient than the latter. But practically speaking, the effects are the same.


The available information is not complete, and there will be gaps. But you can infer an awful lot from limited data. Think about how well you know your friends, and how you can often predict what decisions they will make, with only the small view of their world that you get from your interactions with them. With access to a vast store of reference information, massive deductions can be made.

Conversely, the possibility of faulty deductions is itself a threat to individuals. You would not want to have performed Internet searches for pressure cookers and backpacks just before the Boston marathon bombings.

Dedicated, well-meaning people in law enforcement naturally want to be able to do their jobs better and make the world a safer, and thus better, place. They see the new data as a boon, and law enforcement agencies select extremely unphotogenic criminals and terrorists as the test cases that will set the rules for millions of other people. Unfortunately, while this surveillance apparatus may occasionally be useful, it also poses a structural threat to democracy.

Even beyond the threat of police states in the Western world and elsewhere, there is a fundamental issue with cryptography: the mathematics works the same regardless of whether you are naughty or nice. If the state can break cryptography, then so can other actors. There are obvious direct applications to crime: knowing when someone is away from home, knowing who is worth kidnapping and what their movements are, identity theft, bank fraud, and so forth. But ineffective cryptography also strengthens the black market for industrial espionage; many people would pay to know the thoughts of their competitors, people they are negotiating with, or even people they are considering going on a date with.
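
To make that symmetry concrete, consider a toy sketch in Python (entirely hypothetical, not modeled on any real system or proposal): if a cipher's key space is deliberately weakened so that an "authorized" party can brute-force it, the identical brute-force loop works for anyone who obtains the ciphertext, because the search has no way of checking who is running it.

import hashlib

def weak_encrypt(plaintext: bytes, key: int) -> bytes:
    # Toy XOR stream cipher keyed by a small integer (NOT real cryptography).
    stream = hashlib.sha256(key.to_bytes(4, "big")).digest()
    return bytes(b ^ stream[i % len(stream)] for i, b in enumerate(plaintext))

def brute_force(ciphertext: bytes, known_prefix: bytes, keybits: int = 20):
    # Police, criminal, or foreign spy: the same loop serves all of them.
    for key in range(2 ** keybits):
        if weak_encrypt(ciphertext, key)[:len(known_prefix)] == known_prefix:
            return key
    return None

secret = weak_encrypt(b"meet me at noon", key=48879)
print(brute_force(secret, b"meet"))  # recovers the key in at most ~1M guesses

A 20-bit key space falls in seconds on commodity hardware; what keeps real ciphers safe is only that no one, friend or foe, can afford the search.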

Of course the state is not the only institution that wants to read your mind. There is great value to corporations in knowing about you. They collect this data from phone apps and operating systems, credit cards, and web browsers; they use it to help design their products, but also for targeted advertising, differential pricing, and other debatable purposes. People joke, semi-seriously, that Google knows you better than you know yourself. As well as being a threat in their own right, corporations provide an additional target of attack for an intrusive state. As Snowden's leaks revealed, the NSA did not try to track the location of every cellphone on the planet directly; it let advertisements and tracking code in apps collect the data for it.

Ultimately, the question of what to do about the data accumulated by technology companies is different from the question of what to do about the FBI, but it should also be understood that we have largely given these companies the power to read our minds, and we might want to find alternatives to that arrangement.

We fear we are slowly moving toward the era of universal mind monitoring without having recognized and considered it in those terms. And those are the terms in which we should understand battles about the right to use effective cryptography. That wonderful gadget in your pocket is not a phone. It is a prosthetic part of your mind—which happens to also be able to make telephone calls. We need to think of it as such, and ask again which parts of our thoughts should be categorically shielded against prying by the state.
