The EFF technologist does not think he is overestimating the power of Big Data. “We may be overestimating how accurate it is, but not the power it has over people’s lives. There are US states where sentencing guidelines are set by data: how long someone goes to jail is based on statistics. That’s a lot of power, I would say,” Gillula pointed out.
However, he thinks privacy and Big Data can coexist if certain rules are respected. “First, not all data should be collected. Second, consent is essential. Third, users should encrypt a lot. Fourth, data about people is different from data about floods or other physical systems: people committed suicide after hacks like the Ashley Madison one,” he said. Computer scientists can do a lot to help. The EFF, for example, has developed Privacy Badger, a browser extension that blocks third-party tracking cookies, and helped launch Let’s Encrypt, a free certificate authority that makes it easy for websites to deploy HTTPS encryption. Above all, scientists can raise awareness. “To the general public, anybody working in computer science is a magician, and you can take advantage of that: understand the general problems and speak out,” he concluded.
The standard mechanisms for privacy protection (like the “notice and consent” forms for sharing data online) are grossly ineffective in the era of Big Data. Regulation and policy are required to tackle the issue effectively.
Alessandro Acquisti is professor of information technology and public policy at Carnegie Mellon University (CMU). He also directs the Peex (Privacy Economics Experiments) lab and co-directs the Center for Behavioral and Decision Research (CBDR), both at CMU.
People have sought privacy in all times and places: there is evidence of this in pre-industrial Java and Bali in the 1950s, in Tuareg communities, in ancient Rome, and in classical Greece. Holy texts such as the Quran and the Torah refer to informational privacy. Even the Book of Genesis deals with privacy, when Adam and Eve discover they have private parts.
“Evidence of universal privacy-seeking behaviour debunks some of the frames used in the mainstream debate. For example, the idea that the success of social media shows that people don’t really care about privacy. Or the idea that privacy is a modern invention, since we supposedly always lived in a village where everybody knew each other, and the normal state of humankind has always been the lack of privacy,” explained Alessandro Acquisti, professor of information technology and public policy at Carnegie Mellon University (CMU).
Indeed, not only humans but even animals seek privacy: certain apes seek seclusion when mating, and cats seek privacy when they are sick. “Why do they do so? Is it for security? That is the point: I believe privacy is related to some need for security. Our ability to manage the public and the private is a way to manage security,” Acquisti argues.
His work has explored how people make decisions about privacy, and the results sound an alarm. People face great difficulties in handling privacy, so the standard online “notice and consent” mechanisms are grossly inadequate for privacy protection. Regulation and policy are required to tackle the issue, according to Acquisti.
In the first place, privacy decisions are far from purely rational. For example, sensory input can change them. In an experiment in which volunteers were asked to fill out a form with sensitive questions (e.g., a secret sexual fantasy), the tendency to reveal sensitive information changed with inputs such as an actor dressed as a guard walking around the room, a muffled phone conversation in a neighbouring room, or even the smell of clove oil. “Offline, we are very good at using sensory stimuli to negotiate privacy boundaries. But online we lack them: we don’t smell the Facebook and Google that are monitoring us,” said Acquisti.
Even if users were completely rational, they would still ignore some of the consequences of their decisions. In one experiment, Acquisti took pictures of students on a campus and used face-recognition software to match them with Facebook profiles: he found the profile of one out of three subjects. In another experiment, he combined demographic data that people disclose on Facebook with publicly available Social Security files to statistically predict users’ Social Security numbers (SSNs), identifying, for example, the first five digits of the SSN for around a quarter of the subjects. “Chain the two experiments together and you can predict SSNs from faces!” he noted. Interoperability produces data accretion. “We reveal so much data, and analytics is becoming so sophisticated, that sensitive information can easily emerge from trivial information,” he pointed out.
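The data-accretion effect can be sketched in a few lines of code. Before 2011, the first three digits of a US SSN (the “area number”) depended on the state of issuance, and the next two (the “group number”) were assigned in a predictable sequence over time, so two innocuous disclosures, hometown and date of birth, drastically shrink the search space. The allocation tables below are small hypothetical samples for illustration only, not real Social Security Administration data; Acquisti’s actual study inferred the patterns from public death-record files.

```python
# Toy illustration of "data accretion": combining two trivial
# disclosures (state and birth year) to narrow the first five
# digits of a pre-2011 US SSN.
# NOTE: both tables below are HYPOTHETICAL sample values.

# Hypothetical area numbers (first 3 SSN digits) per state.
AREA_BY_STATE = {
    "NH": ["001", "002", "003"],
    "VT": ["008", "009"],
}

# Hypothetical group numbers (digits 4-5) in use around a birth
# year: groups were issued in a known order, so the birth date
# narrows which ones were plausibly active.
GROUPS_BY_YEAR = {
    1988: ["02", "04"],
    1990: ["06", "08"],
}

def candidate_prefixes(state, birth_year):
    """All plausible 5-digit SSN prefixes for that state and year."""
    return [area + group
            for area in AREA_BY_STATE.get(state, [])
            for group in GROUPS_BY_YEAR.get(birth_year, [])]

# 2 areas x 2 groups = 4 candidates out of 100,000 possible prefixes.
print(candidate_prefixes("VT", 1990))
```

Even with made-up tables, the structure of the attack is visible: each extra piece of “harmless” data multiplies into a large reduction of the space an attacker must search.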
Acquisti also questioned the idea that sharing everything will enrich everybody. In a simulation of the interaction between consumers, websites, and ad networks, he showed that when all of the consumers’ information is shared, their surplus is zero while all the benefits go to the ad network. “Big Data will make the economic pie larger, but it may also change how it is shared,” he remarked.
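The intuition behind that result is the textbook economics of perfect price discrimination, and a toy version is easy to compute. The sketch below uses made-up consumer valuations (it is not Acquisti’s actual simulation): a seller that knows each consumer’s exact willingness to pay can charge each of them precisely that amount, so consumer surplus falls to zero even though total trade (the “pie”) grows.

```python
# Toy market: how full information sharing can shift all surplus
# away from consumers. The valuations are hypothetical numbers.

valuations = [3.0, 5.0, 8.0, 10.0]  # each consumer's willingness to pay

# Case 1: no data sharing. The seller posts one uniform price;
# consumers who value the good above that price keep the difference.
uniform_price = 5.0
buyers = [v for v in valuations if v >= uniform_price]
consumer_surplus_uniform = sum(v - uniform_price for v in buyers)
seller_revenue_uniform = uniform_price * len(buyers)

# Case 2: full data sharing. The seller knows every valuation and
# charges each consumer exactly what they are willing to pay
# (perfect price discrimination): consumer surplus is zero.
consumer_surplus_targeted = sum(v - v for v in valuations)
seller_revenue_targeted = sum(valuations)

print(consumer_surplus_uniform, seller_revenue_uniform)    # 8.0 15.0
print(consumer_surplus_targeted, seller_revenue_targeted)  # 0.0 26.0
```

Note that total revenue rises from 15.0 to 26.0 when information is fully shared, which is exactly the “larger pie, differently shared” point: more trade happens, but every unit of the gain accrues to the side holding the data.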