Yves Le Roux is the Technology Strategist at CA Technologies and Chair of ISACA’s Data Privacy Task Force. In this interview he discusses the evolution of digital identity, the influence of politics on privacy, Google Glass, and much more.
What are the critical issues in understanding the very nature of identity in a society actively building bridges between the real and digital world?
If you speak to a psychologist, he or she will explain that each individual integrates various aspects of identity, memory and consciousness into a single multidimensional self. As a study by Cabiria (2008) put it, “The structure and design of virtual worlds allows its users to freely explore many facets of their personalities in ways that are not easily available to them in real life”. But this may have consequences. For example, if an individual creates a virtual identity that is different from their real-life identity, it can take a lot of psychological effort to maintain the false identity. Eventually, one of two things will happen: the identities may converge into one, making the virtual and real identities truer, or the individual may simply discard the virtual identity and start over with a new one.
The main issue with identities in this virtual world is trust. Law enforcement officials view this possibility of multiple untrusted identities as an open invitation to criminals who wish to disguise their identities. Therefore, they call for an identity management infrastructure that would irrevocably tie online identity to a person’s legal identity.
A popular opinion among politicians is: “If you have nothing to hide, you have nothing to worry about”. Why is privacy still important, even if you have nothing to hide?
The line “if you’ve got nothing to hide, you have nothing to worry about” is used all too often in defending surveillance overreach. It has been debunked countless times. For example, in 2007, in a short essay written for a symposium in the San Diego Law Review, Professor Daniel Solove (George Washington University Law School) examined the nothing to hide argument.
His conclusion was: “The nothing to hide argument speaks to some problems, but not to others. It represents a singular and narrow way of conceiving of privacy, and it wins by excluding consideration of the other problems often raised in government surveillance and data mining programs. When engaged with directly, the nothing to hide argument can ensnare, for it forces the debate to focus on its narrow understanding of privacy. But when confronted with the plurality of privacy problems implicated by government data collection and use beyond surveillance and disclosure, the nothing to hide argument, in the end, has nothing to say.”
In our privacy study, which builds on a European paper by Michael Friedewald, we distinguish seven types of privacy:
1. Privacy of the person encompasses the right to keep body functions and body characteristics (such as genetic codes and biometrics) private.
2. Privacy of behaviour and action includes sensitive issues such as sexual preferences and habits, political activities and religious practices.
3. Privacy of communication aims to avoid the interception of communications, including mail interception, the use of bugs, directional microphones, telephone or wireless communication interception or recording and access to e-mail messages.
4. Privacy of data and image includes concerns about making sure that individuals’ data is not automatically available to other individuals and organisations and that people can “exercise a substantial degree of control over that data and its use”.
5. Privacy of thoughts and feelings concerns people’s right not to share their thoughts or feelings, and not to have those thoughts or feelings revealed. Individuals should have the right to think whatever they like.
6. Privacy of location and space gives individuals the right to move about in public or semi-public space without being identified, tracked or monitored.
7. Privacy of association (including group privacy) concerns people’s right to associate with whomever they wish, without being monitored.
Considering the full spectrum of privacy, are you sure you have nothing to hide? For example, do you want people to know where you spend your time — and, when aggregated with others, who you like to spend it with? Whether you have called a substance abuse counselor, a suicide hotline, a divorce lawyer or an abortion provider? What websites you read daily? What porn turns you on? What religious and political groups you are a member of?
How has privacy evolved in the digital world? What are users still doing wrong?
The Internet is a worldwide network, and everything must be developed for a global environment (without national borders). Cloud computing delivery models require the cross-jurisdictional exchange of personal data to function at optimal levels.
In January 2011, the World Economic Forum (WEF) issued a publication entitled “Personal Data: The Emergence of a New Asset Class”. In this document, the WEF highlighted the differences in privacy-related laws and policy enforcement across jurisdictions, which are often rooted in cultural, political and historical contexts, and noted that attempts to align such policies have largely failed. For the WEF, the key to unlocking the full potential of data lies in creating equilibrium among the various stakeholders influencing the personal data ecosystem. A lack of balance between stakeholder interests – business, government and individuals – can destabilize the personal data ecosystem in a way that erodes rather than creates value.
Furthermore, the service provider may change its policy. Everybody remembers the Instagram case: in December 2012, Instagram said that it had the perpetual right to sell users’ photographs, including for advertising purposes, without payment or notification. Due to the strong reaction, Instagram backed down.
Many consumers are poorly educated about how their personal data is collected by companies and are unsure about what it is actually used for. Investigation into the recent implementation of the EU Cookie Law has highlighted how misinformed consumers in Europe currently are. For example, 81 percent of people who delete cookies do not distinguish between the “first-party” cookies that give a website its basic functionality (e.g., remembering what items the consumer has placed in their shopping basket) and the “third-party” cookies that advertisers place on websites to track users’ browsing. At the same time, 14 percent said they thought the data used to show them relevant ads included information that could identify them personally, while 43 percent were not sure if this meant their identity was known.
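The first-party/third-party distinction comes down to whose domain a cookie belongs to relative to the site the user is visiting. As a minimal illustrative sketch (a real browser decides this from the full requesting context; the function and the example domains here are hypothetical):

```python
from urllib.parse import urlparse

def classify_cookie(page_url: str, cookie_domain: str) -> str:
    """Classify a cookie as first- or third-party relative to a page.

    A cookie is "first-party" when its domain matches the site being
    visited, and "third-party" when it is set for a different domain
    (typically by embedded advertising or tracking content).
    """
    site = urlparse(page_url).hostname or ""
    cookie_domain = cookie_domain.lstrip(".")
    if site == cookie_domain or site.endswith("." + cookie_domain):
        return "first-party"
    return "third-party"

# A shopping-basket cookie set by the shop itself:
print(classify_cookie("https://shop.example.com/basket", "example.com"))
# prints "first-party"

# A tracking cookie set for an ad network's domain:
print(classify_cookie("https://shop.example.com/basket", "ads.tracker.net"))
# prints "third-party"
```

The point of the sketch is that the same cookie mechanism serves both purposes; only the relationship between the two domains differs, which is why the distinction is so easy for consumers to miss.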
With wearable recording devices such as Google Glass getting traction, we are opening ourselves for a new type of privacy invasion. Do you see people embracing such technologies en masse or can we expect them to question those that do?
Google Glass is essentially a phone in front of your eyes with a front-facing camera. A heads-up display with facial recognition and eye-tracking technology can show icons or stats hovering above people you recognize, give directions as you walk, and take video from your point of view.
In July 2013, Google published a new, more extensive FAQ on Google Glass. There are nine questions and answers listed under a section named Glass Security & Privacy, several of which concentrate on the device’s camera and video functionality.
But this doesn’t solve other privacy concerns:
- Google Glass tracks your eye movements and makes data requests based on where you’re looking. This means the device collects information without active permission. Eye movements are largely unconscious and have significant psychological meanings. For example, eye movements show who you’re attracted to and how you weigh your purchase options when shopping.
- How many of you will turn off your Glass while punching in your PIN? How about when a person’s credit card is visible at the edge of your vision? How about when opening your bills, filling out tax information, or filling out a health form? Remember that computers can recognize numbers and letters blazingly fast – even a passing glance as you walk past a stranger’s wallet can mean that the device on your face learns her credit card number. All of this information can be compromised in a security breach, revealing information about both the person using Glass and the people they surround themselves with.
- On July 4th 2013, Chris Barrett, a documentary filmmaker, was wearing Glass at a fireworks show in Wildwood, N.J., when he happened upon a boardwalk brawl and subsequent arrest. The fact that the glasses were relatively unnoticeable, and could record hands-free, made a big difference: “I think if I had a bigger camera there, the kid would probably have punched me,” Barrett said.
In your opinion, which will be the major threats to privacy in the near future?
Privacy is entering a time of flux, and social norms and legal systems are trying to catch up with the changes that digital technology has brought about. Privacy is a complex construct, influenced by many factors, and it can be difficult to future-proof business plans so they keep up with evolving technological developments and consumer expectations about the topic.
One way to ensure there are no surprises around privacy is to see it not as a right, but rather as an exchange between people and organizations that is bound by the same principles of trust that facilitate effective social and business relationships. This is an alternative to the “privacy as a right” approach; it instead positions privacy as a social construct to be explicitly negotiated so that it is appropriate to the social context within which the exchange takes place.
The lengthy privacy policies, thick with legalese, that most services use now will never go away, but better controls will probably emerge. Whatever tools are used to protect and collect personal data in the future, it will be important for companies like Facebook and Google to educate their consumers and to provide them with options for all levels of privacy.
Yves will be addressing these issues and others at the 2013 European Computer Audit, Control and Security (EuroCACS) / Information Security and Risk Management (ISRM) conference that will take place at Hilton London Metropole on the 16th – 18th September 2013.