Is privacy becoming a luxury? A candid look at consumer data use
In this Help Net Security interview, Dr. Joy Wu, Assistant Professor, UBC Sauder School of Business, discusses the psychological and societal impacts of data monetization, why current privacy disclosures often fall short, and what it will take to create a more equitable data ecosystem. From the limits of transparency to the potential for privacy to become a luxury, Wu offers a candid assessment of where we are, and where we’re headed.
Consumers significantly change their behavior once they’re told their data will be monetized. What does that tell us about the effectiveness of current privacy disclosures?
Data monetization is widespread in digital markets, and people care when it happens to their personal data. If privacy disclosures omit data monetization, they miss something essential about why people value privacy. For example, disclosing whether personal information is being collected differs from disclosing how it’s being used. People think of these two things differently, and both shape a person’s choice to share data.
Informed consumers demand more. Could this lead to a future where privacy becomes a luxury good, something only those with resources can protect?
Yes, this is possible in the absence of the right policies or data ecosystem, when the burden of protecting privacy falls on consumers in their daily lives. Not everyone has the time and energy to stay informed about data-sharing consequences, and not everyone can afford privacy-enhancing products and services. These gaps raise important questions about inequality in who gets to have privacy and who ends up sharing data. This is also consequential for industries that rely on consumer data, because it determines which populations are over- or under-represented in prediction algorithms.
Do you believe transparency alone is enough, or is there a need for systemic change in how data ecosystems operate?
Transparency alone is agnostic to how personal data are collected and managed. Firms are challenged with figuring out which privacy options consumers prefer, weighed against the quality of the products and services exchanged for data sharing. People demand privacy, and the more we understand what people value about privacy, the better firms can tailor their products and services to meet those preferences. Ultimately, firms can differentiate themselves by how they manage and use consumer data, which can certainly complement transparency initiatives.
But we should also consider the limits of transparency solutions more generally, beyond whether firms are being transparent about things consumers care about. Once transparency is provided, it falls to the user to incorporate all the relevant privacy consequences into their decisions, and there are limits to relying on this. It doesn’t make sense for a person to calculate every potential privacy consequence before every data-sharing decision. Consumers make too many decisions online to keep up with opt-in and opt-out choices for every action, and in many situations they are choosing among alternatives that offer no differentiated privacy options.
Finally, there are other forces that can shape the commercial use of personal data. On the regulation side, these include policies on how firms can use data, consumer rights over the data that others hold about them, and even antitrust issues in digital industries. On the industry side, there is potential for the rise of new services and technology solutions that enhance consumer privacy, and that people are willing to pay for.
How do people perceive the fairness or ethics of secondary data monetization?
I have some theories for why people dislike others monetizing their data, and fairness and ethics are among the top. People may think it’s unfair that others profit from their data, in which case consumers may want to be paid data dividends. People may also, on some level, find it unethical that their personal data can be monetized by others at all.
If we continue down the current path of opaque data monetization, what do you foresee as the consumer or societal backlash?
People are becoming more digitally informed every day, and they will not remain inattentive forever to the consequences they care about. Eventually, people will become aware that (1) they’ve lost more privacy than they believe they consented to, or (2) they were underinformed when they consented. This can lead to backlash, and societal backlash can certainly motivate regulations around consumer protection and privacy rights.
I would also not discount people’s ability to adapt their behavior over time, such as moving away from earlier, less-informed online habits. People will become increasingly aware of how their data can be traded like an asset in secondary markets. If this phenomenon continues, people continue to dislike it, and they do not receive greater benefits in exchange, then they can learn to engage with digital markets in a far more selective or restrained way to maintain their privacy.