Michael Chertoff: Big Nanny Is Watching You

Beyond the Internet

But the data explosion raises risks to more than our freedom. The expansion of online networks that are connected to physical systems, and that even control their operation, has dramatically expanded the ability of malign individuals to interfere with the physical world. This affects everything from the generation of the electricity that powers the grid to the performance of our automobiles. This expansion of network-controlled mechanical systems places an increasing burden on governments, private parties, and ordinary citizens to secure their computers and systems against a surge of attacks from around the world. Traditional rules governing security and liability must adapt to address these burgeoning threats. And this necessity to protect our world may conflict with the very real concern about the growing collection of our personal information.

One point should be clear. While it is customary to refer to modern big data developments as a result of the internet, that is an oversimplification. These developments were caused by changes in the way we collect, store, transmit, and analyze data, as well as in the interaction between digital transmissions and the operation of control systems that regulate our physical world. As I will describe, a confluence of circumstances drove these changes. Certainly, the creation of the internet was one, driven by the need for a flexible communications system that could survive natural or man-made destruction of the normal methods of communication. Other strides in data collection and analytics are the result of a new national security environment in which threats are no longer nation-states but instead online enemies who can be detected and thwarted only by monitoring the global movement of people, money, and communications. And even more profoundly, data has become valuable as a tool for targeted marketing and as a means of reducing the cost of executing commercial transactions. In short, the data revolution was powered by, and powered, the transformative expansion of our global economy.

Yet these revolutionary changes in the use of data have far outpaced our legal and policy architecture. We want to establish rules of the road to reconcile the competing demands for security, privacy, autonomy, economic growth, and convenience. But as security expert Bruce Schneier has observed, our legal system can be slow to adapt to technological change. We should not try to fit new technologies into the procrustean bed of existing outdated legal doctrines. What we need is to go back to basics: setting forth a clear understanding of the values we want to preserve, what challenges the world of big data presents, and how our legal system should evolve to address those challenges.

The History of Data

To put this effort in context, it’s worth recalling that we have historically recognized the need to restructure our laws and policies when confronted with a technological disruption. We are actually in the third of three transformations in the history of surveillance and data (or information) collection. I call these periods Data 1.0, Data 2.0, and Data 3.0.

Data 1.0 refers to the era before the invention of automated recording devices such as cameras, telephones, and tape recorders. After writing was invented, records were limited to handwritten or printed notes or drawings. These were observations of what was seen or heard through face-to-face interaction, or what was read in another handwritten record. The reliability of this material depended upon the communicator’s ability to mentally record and transmit it, either by telling someone else or by writing it down.

But the first transformation in the handling of data came with the invention of the printing press. This allowed broader dissemination of information and the ability to store writings in libraries. Even so, retention, dissemination, and usability of written information were restricted by limitations on storage space, reproduction, and modes of communication.

Data 2.0 refers to the period after the invention of photography and telephony in the 19th century. Photographs allowed for superior recollection of events through reproduced images. Somewhat more comprehensive visual recording came with the arrival of video. Telephones enabled communication over longer distances more quickly. But telephones and microphones also allowed for deeper access into personal lives from remote locations via wiretaps and electronic surveillance devices, or “bugs.” These technologies made life much more convenient, but at the same time they opened up new methods of surveillance. As I will explain, after a struggle, our laws governing these new data technologies evolved to strike a balance between these values.

Data 3.0 is today’s increasingly digital world. Since the 21st century began, photographs have been recorded no longer on film but in bytes of information that can be stored indefinitely, copied easily, and transmitted worldwide instantaneously, at will, via computers and smartphones. As data storage systems improve, more and more data is captured and retained.

Data 3.0 is also characterized by the advent of data analytics—using computer software to examine vast troves of data and reach conclusions that humans could not reach on their own. Mining data has provided many benefits, but it also enables pernicious uses. All this available data comes with societal consequences. We must learn to manage it in a way that protects individuals while enabling benefits to society as a whole.

And this is unlikely to be the last chapter in the evolution of how we handle data. We can imagine a Data 4.0—already prefigured by modern-day robots and artificial intelligence—in which embedded software in human beings creates true cyborgs: hybrid human machines. The initial steps toward this vision can be seen in the proliferation of wirelessly connected implanted medical devices like insulin pumps and pacemakers, as well as computer-driven limbs used to rehabilitate neurological or other medical deficits.