
Michael Chertoff: Big Nanny Is Watching You

In a new book, former Homeland Security Secretary Michael Chertoff argues that unless our legal structures change, the main casualty of the big-data revolution won’t be our privacy—it could be our autonomy. Here’s an excerpt.

James’s eyes pop open, prying his thoughts from slumber. Once again, he has woken up at 5:43 a.m. James always does. The monitor never lets him linger in bed. He sometimes wonders what the early-21st-century “snooze function” might have been like. He has never experienced such a thing but has seen it in a few old movies. In modern 2084, the ideas of the previous century have been deemed irrelevant, and most of their media has been destroyed.

James has no such luxury. At the optimal awakening time, the monitor, already aware of his sleep phase, begins playing sounds to rouse him. The audible portions are supposed to be relaxing; James has chosen beach waves that remind him of his childhood on Cape Cod. The nearly inaudible portions work on his subconscious, causing his body to begin waking whether he wants to or not.

Today he gets up quickly. Previously, the monitor’s neural scan of James determined that he had been too slow in pulling out of a deep sleep, so it has increased the amount of subliminal communication. James doesn’t know what it would be like to wake up late; the thought is so foreign to his prescribed daily routine that it occurred to him only after he had seen one of those old movies.

After James showers and makes his way toward the kitchen, the monitor presents him with three healthy breakfast options matching his weight, age, and health history. He is glad that he is still young enough to be allowed bacon, and he chooses a breakfast burrito heavy in kale and infused with egg whites. If he eats more than what is presented, the questioning begins. The same thing happens if James refuses to eat. The last time he attempted to skip breakfast, the monitor detected his failure to take in the necessary calories and, because his daily bloodwork also showed a rise in his white blood cell count, deemed James too ill to work and sent him to bed.

Entering his travel pod, he begins his commute to the office.

Upon James’s arrival, he is greeted by the monitors stationed outside the building: “Welcome, James Jones. The morning meeting begins at 9:00 in conference room B. Six out of eight attendees have arrived and are stationed in the room. Marcos is 2 minutes and 46 seconds away from arrival.”

“Chipped” at birth, James is accustomed to having his location known and available to others. Initially developed as an expensive, optional parental security feature to ensure quick rescue in case of kidnapping or accident, the chips eventually came to be demanded by everyone. Mass production and government support have made them affordable, and their usefulness in convicting criminals has made them socially accepted. Anyone who wants to find James can do so. As a by-product of the chip, his life’s history can be played out as a simple series of circular patterns that rarely shift. It isn’t as if he consciously thinks about it; his behavior morphed because he simply doesn’t want to face the interrogation that inevitably comes if he happens to be in the wrong place at the wrong time. Life is easier if his transit patterns match what is expected.

Although he already knows everyone in the conference room, as James enters, his “eyeglass” implants identify each participant by name. Though the technology is relatively new, James still finds it odd to view the world in “assisted mode.” As he scans the room, an indicator showing each person’s name is tagged in his vision. If he desires, James can probe for more information—his colleagues’ education and work background, intelligence score, family members, and even medical history—by accessing the visual internet database.

After first receiving his eyeglass, James regularly went back to review meetings from his colleague Amy’s perspective, hoping he might catch how often she had glanced his way. At first, she stole quick looks; she stopped when the monitor flagged her viewing patterns as irrelevant and a waste of corporate resources. James thinks Amy might be interested in him, but it is too hard to find a legitimate reason to reach out to her. Eventually, he gives up.

Crime rates have fallen tremendously. It is too hard to do something illegal when the crime is almost always captured by either an eyeglass or one of the scanning monitors installed as part of every streetlight. Homeowners installed their own scanning monitors when criminals began to target homes without such devices. The monitors proliferated on a vast scale. None of it was mandated; it was as if the network spread on its own.

The dramatic drop in crime rates is due not only to the increased surveillance but also to the increased ability of the organization to predict bad thoughts, ideas, and ultimately actions. This began as an improvement on archaic lie-detector testing. Eye movements were first mapped to speech. That data was then combined with behavior recorded by the myriad sensors and video cameras throughout the city. From this data, predictive analytics can identify a predisposition to erratic and even dangerous behavior.

Thoughts that cross a high negative threshold are automatically reported to the police.

James shakes himself out of his daydream. He isn’t sure if the authorities can piece together his random thoughts into a coherent stream, but he does not doubt for a second that he is being monitored. Unsure of what thoughts might trigger a report to the police, James finds it simpler to focus only on the task at hand. Friends are a distraction, and he always ends up wondering which one of them is an agent. James wonders about just what is, no longer about what might be.


Future/Present

If this scenario seems far-fetched, consider the combination of things already on the market or in development: facial recognition, automated cars, pervasive closed-circuit TV in many cities, some companies’ use of bird’s-eye cameras overlooking workstations, and voluntary (so far) microchips implanted in employees. Of course, the effects of using any one of these devices may be good or bad. Unfortunately, government policymakers, judges, and everyday consumers too often poorly understand the consequences of the big data revolution.

The effects of big data collection are playing out faster today than ever before. Information sharing has allowed new technologies to be created at an ever-faster pace. Technologies designed for security and classified by governments now quickly find their way into everyday consumers’ hands. The commercial drive to enhance marketing tools also drives relentless innovation in the ability to collect and exploit data. Because today’s information and networks have so many connection points, it is harder and harder to prevent information from leaking. Information doesn’t disappear readily—it sticks. Taken together, these features of modern information technology have sped up the spread of ideas and our personal information.

As an unintended by-product, however, growing interconnectivity has dramatically increased threats to our security and privacy. The proliferation of wirelessly connected devices—often mobile—expands the surface area of network entry points through which hackers can penetrate our information and communication networks. By the same token, the centralized collection of our personal data by government and corporations means it is far easier for hackers to steal that data at huge scale. Consider the following recent cyber data threats: Equifax, the credit agency, loses data pertaining to 143 million Americans; Yahoo has 3 billion users’ accounts compromised; and the U.S. Office of Personnel Management, the government’s human resources agency, has highly sensitive security files relating to over 25 million employees and applicants stolen, perhaps by a foreign nation.

History shows that technological changes bring with them social and normative changes, allowing societies to adapt. The development of the automobile, for example, led to the adoption of safety requirements and the regulation of traffic patterns. Because in modern democracies people ultimately define the rules that determine or restrict their behavior—the social contract—the rules must adjust to meet the needs of the day. But new technology doesn’t always fit within the existing social construct. Trying to force it into an outdated legal system may even break the system. Eventually people react by demanding fundamental changes to the rules. It falls to elected officials, administrators, and courts to recognize changed circumstances and then reconstruct legal and policy standards.

When We Lose Control

As the social contract is renegotiated, a return to basic principles and values is necessary. Standing outside the outmoded paradigms and outdated legal categories, we must re-determine what our core social and ethical values are. What is in danger, and what needs protection? Often the constitutional principles of liberty, security, freedom of expression and association, and independence must be weighed against one another, possibly with the interests of society balanced against the rights and interests of the individual.

The rise of big data capabilities is often critiqued from the standpoint of loss of privacy. But when technologies collect, catalog, and exploit data—much of which is willingly submitted by people—or when data is collected in open public spaces, then privacy is too narrow a concept to reflect what may be at risk.

What is actually at stake is the freedom to make the personal choices that affect our values and our destiny. A person can be manipulated and coerced in many ways, but the most ominous involve the pressure that comes with constant, ongoing surveillance of our actions. Our parents shape our behavior not only by teaching us as children, but also by the examples they set. They hope to instill strong value systems in their children even as they hope that their children will gain new opportunities, ideas, and experiences to mold them. As we grow older, we have more and more opportunities to choose our own way and explore new ideas.

But that freedom can be undermined when we lose control of information about ourselves—our actions, beliefs, relationships, and even our flaws and mistakes.

Modern analytic tools have the potential to form a detailed picture of almost any individual’s activities. It is extremely difficult today to “opt out” of the data stream. Modern life generates data as a necessary part of the convenient services we enjoy. Information collected today is necessarily broader than what was collected in years past; it lasts longer; and it is put to more uses. But those who collect and aggregate that data have an increased power to influence and even coerce our behavior—possibly through social shaming and financial incentives and penalties.

Today’s explosion of big data is often justified as promoting healthy lifestyles, convenient marketing, and even easier and more informed political engagement. But ubiquitous surveillance is a classic tool of oppression, epitomized by the ever-watching Big Brother of George Orwell’s 1984. Are we on the verge of inviting this oppressive surveillance into our own lives, albeit in the deceptively benign guise of a “Big Nanny” who watches over us “for our own good”?

Beyond the Internet

But the data explosion raises risks to more than our freedom. The expansion of online networks that are connected to physical systems, and that even control their operation, has dramatically expanded the ability of malign individuals to interfere with the physical world—everything from the generation of the electricity that powers the grid to the performance of our automobiles. This expansion of network-controlled mechanical systems places an increasing burden on governments, private parties, and ordinary citizens to secure their computers and systems against a surge of attacks from around the world. Traditional rules governing security and liability must adapt to and address these burgeoning threats. And this necessity to protect our world may conflict with the very real concern about the growing collection of our personal information.

One point should be clear. While it is customary to refer to modern big data developments as a result of the internet, that is an oversimplification. These developments were caused by changes in the way we collect, store, transmit, and analyze data, as well as in the interaction between digital transmissions and the operation of control systems that regulate our physical world. As I will describe, a confluence of circumstances drove these changes. Certainly, the creation of the internet was one, driven by the need for a flexible communications system that could survive natural or man-made destruction of the normal methods of communication. Other strides in data collection and analytics are the result of a new national security environment in which threats are no longer nation-states but instead online enemies who can be detected and thwarted only by monitoring the global movement of people, money, and communications. And even more profoundly, data has become valuable as a tool for targeted marketing and as a means of reducing the cost of executing commercial transactions. In short, the data revolution was powered by, and powered, the transformative expansion of our global economy.

Yet these revolutionary changes in the use of data have far outpaced our legal and policy architecture. We want to establish rules of the road to reconcile the competing demands for security, privacy, autonomy, economic growth, and convenience. But as security expert Bruce Schneier has observed, our legal system can be slow to adapt to technological change. We should not try to fit new technologies into the procrustean bed of existing outdated legal doctrines. What we need is to go back to basics: setting forth a clear understanding of the values we want to preserve, what challenges the world of big data presents, and how our legal system should evolve to address those challenges.

The History of Data

To put this effort in context, it’s worth recalling that we have historically recognized the need to restructure our laws and policies when confronted with a technological disruption. We are actually in the third of three transformations in the history of surveillance and data (or information) collection. I call these periods Data 1.0, Data 2.0, and Data 3.0.

Data 1.0 refers to a time when information was collected prior to the invention of automated recording devices, such as cameras, telephones, and tape recorders. After writing was invented, records were limited to handwritten or printed notes or drawings. These were observations of what was seen or heard through face-to-face interaction, or what was read in another handwritten record. The reliability of this material depended upon the communicator’s ability to mentally record and transmit it, either by telling someone else or by writing it down.

But the first transformation in the handling of data came with the invention of the printing press. This allowed broader dissemination of information and the ability to store writings in libraries. Even so, retention, dissemination, and usability of written information were restricted by limitations on storage space, reproduction, and modes of communication.

Data 2.0 refers to the period after the invention of photography and telephony in the 19th century. Photographs allowed for superior recollection of events through reproduced images. Somewhat more comprehensive visual recording came with the arrival of video. Telephones enabled communication over longer distances more quickly. But telephones and microphones also allowed deeper access into personal lives from remote locations via wiretaps and electronic surveillance devices, or “bugs.” These technologies made life much more convenient, but at the same time they opened up new methods of surveillance. As I will explain, after a struggle, our laws governing these new data technologies evolved to strike a balance between these competing values.

Data 3.0 is today’s increasingly digital world. Since the 21st century began, photographs are recorded no longer on film but in bytes of information that can be stored indefinitely, copied easily, and transmitted worldwide instantaneously, at will, on computers and smartphones. As data storage systems improve, more and more data is captured and retained.

Data 3.0 is also characterized by the advent of data analytics—using computer software to examine vast troves of data and reach conclusions that humans could not reach on their own. Mining data has provided many benefits, but it also enables pernicious uses. All this available data comes with societal consequences. We must learn to manage it in a way that protects individuals while enabling benefits to society as a whole.
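To make that abstract idea concrete, the sketch below shows one rudimentary form of such analytics: clustering a handful of location pings into a person’s habitual places. It is a minimal illustration only—written in Python with the open-source scikit-learn library, using invented coordinates—and is not drawn from the book.

```python
# Illustrative sketch only: clustering location pings into habitual
# places. All coordinates below are invented for the example.
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical (latitude, longitude) pings collected over one week
pings = np.array([
    [42.360, -71.058], [42.361, -71.057], [42.359, -71.059],  # nights
    [42.349, -71.100], [42.350, -71.101], [42.348, -71.099],  # weekdays
    [42.401, -71.005], [42.402, -71.004],                     # weekends
])

# Three clusters stand in for three habitual places; the centroids
# reveal a routine the subject never explicitly disclosed.
model = KMeans(n_clusters=3, n_init=10, random_state=0).fit(pings)
for lat, lon in model.cluster_centers_:
    print(f"habitual location near ({lat:.3f}, {lon:.3f})")
```

Scaled from eight pings to billions, and joined with purchases, searches, and social ties, the same elementary technique yields the detailed picture of an individual’s activities described earlier.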

And this is unlikely to be the last chapter in the evolution of how we handle data. We can imagine a Data 4.0— already prefigured with modern-day robots and artificial intelligence—in which embedded software in human beings creates true cyborgs: hybrid human machines. The initial steps toward this vision can be seen in the proliferation of wirelessly connected implanted medical devices like insulin pumps and pacemakers, as well as computer-driven limbs used to rehabilitate neurological or other medical deficits.

What Will We Protect?

In thinking about how the law has evolved and should evolve, it’s fundamental to clarify what values we want to preserve and rebalance. In the world of Data 1.0 the law primarily protected a privacy interest in physical spaces through property rights. At the time of the signing of the U.S. Constitution, the rule was that judicial permission was needed to search and seize in private physical space. When Data 2.0 arrived, technology like the telegraph and telephone shrank physical distances between spaces. The law adapted to protect a privacy interest in personal conversations, requiring a warrant to intercept conversations, even over publicly located telephone wires.

Today, Data 3.0 technology is again changing what is at stake. The sheer amount of personal data recorded, stored, and analyzed is staggering. New technologies have further blurred the line between what is public and what is private, and information once collected does not readily disappear. While we will always be concerned about our privacy in physical spaces and communications, the advent of mass collection, storage, analysis, and distribution of personal data means that we must also consider how we control data generated by or about us, even if it was not collected in what we might generally consider personal physical space.

What should we protect? Privacy is too narrow a value: it covers only the concealment of behavior sheltered from others in a private space or on privately designated communications facilities. In a world irreversibly governed by ubiquitous Data 3.0, hiding or obscuring behavior is impossible.

I argue that what we can and should care about is the broader value of autonomy, which is at the very core of freedom. Autonomy is the ability to make our own personal choices, restricted only by transparent laws and influenced by the social norms that affect our reputations within our communities. Autonomy is fundamental to human nature and respected in a modern, democratic society. Under the democratic ideal and the rule of law, citizens are bound only by laws and regulations openly and democratically adopted and objectively enforced. Less coercively, our conduct is shaped by norms honored within our own civil society institutions and communities.

These principles—the essence of the rule of law—are eroded when the availability of ubiquitous personal data means that any data holder (official or private) can use psychological manipulation, shaming, and financial incentives and penalties to influence and possibly coerce almost every facet of human behavior: what we watch, see, and eat; how we behave; and to whom we relate. When government has total access to your personal information, the practical reach of its authority is almost limitless, transgressing the formal constitutional or statutory limits on official power. Just as alarming, unfettered access to that data can allow private enterprises or groups to pressure, manipulate, or incentivize personal behavior without any public accountability. As the growing phenomenon of online bullying illustrates, communities with which we have no connection can use data about us to retaliate against or annoy us if they don’t like our political or even aesthetic opinions.

To preserve space for lawful personal choice means that we must have significant control over data about ourselves— our likenesses, the things we do, our thought processes and decisions. A world in which every step we take factors into auto insurance or marketing, or allows the government to predict and regulate our behavior, would be a substantial constraint on our freedom of belief, our relationships, and our actions. Essentially, it means we would become programmed. We are moving in that direction.

We have always been worried that Big Brother might force his way into our home and compel obedience under his watchful eye. But Big Brother need not beat down the door. We are currently rolling out the red carpet to welcome him. And Big Brother is not just the government but also foreign nations, organized criminals, and even private companies. Indeed, as we incessantly record one another, we become Little Brothers and Sisters.

At a fundamental level, people should be aware of what is being done with their data, and they should make a choice about how to deal with it. Put another way, protection of our way of life must move beyond a right to conceal our data and into a broader right to control our data, even when hiding it or privately maintaining it becomes technologically impossible. Unless we take stock of our new digital environment and its consequences, we may lose not just privacy but also freedom and autonomy in the name of convenience.



Michael Chertoff

Michael Chertoff served as secretary of the U.S. Department of Homeland Security from 2005 to 2009. A former judge, he is co-founder and executive chairman of the Chertoff Group.
