The obvious and not-so-obvious data you wouldn’t want companies to have

What types of data are companies collecting, and when does it stop serving us?


Value exchange: The ultimate differentiator

First, let’s assess the process of giving away data. It’s estimated that in 2020, every person produced approximately 1.7 MB of data every second – roughly 147 GB per day. Without a doubt, a portion of this data is created – and provided – consciously and voluntarily, such as signing up for a newsletter, posting on Instagram, or allowing cookies when browsing your favorite online store.

Voluntary provision is all about value exchange, so it’s key to understand what companies use your data for. Is it to make a product or service better? If yes, providing relevant information might be beneficial. But what if it’s for third-party advertising or data brokerage purposes? Then users would probably expect more value in exchange. When there’s no value, or when the value fails to match the sacrifices we make in the process, that is when the data becomes something we wouldn’t want companies to have.

Apart from voluntarily provided data, there is also your digital data exhaust – the data you generate in the background as you interact with a service, such as cookies, tracking tags, or your browser fingerprint (your IP address, web browser version, and more). Individually, these are fragmented pieces, but together they can single you out and build a far more comprehensive digital image of you. Combining this with voluntarily provided data adds a whole new dimension – profiling data that captures your interests and behaviors. This is where your footprint gets intricate and often spirals way out of control.
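To make this concrete, here is a minimal fingerprinting sketch (hypothetical code, not any specific tracker’s) showing how a handful of standard browser APIs can be combined into a near-unique identifier that survives cleared cookies:

```typescript
// A minimal browser-fingerprinting sketch: each signal is weak on its
// own, but combined they can single out one browser among millions.
// Real trackers collect many more signals (fonts, canvas, audio...).
async function buildFingerprint(): Promise<string> {
  const signals = [
    navigator.userAgent,                                      // browser + OS version
    navigator.language,                                       // preferred language
    `${screen.width}x${screen.height}x${screen.colorDepth}`,  // display setup
    Intl.DateTimeFormat().resolvedOptions().timeZone,         // coarse location
    String(navigator.hardwareConcurrency),                    // CPU core count
  ].join("|");

  // Hash the combined signals into one compact, stable identifier.
  const digest = await crypto.subtle.digest(
    "SHA-256",
    new TextEncoder().encode(signals),
  );
  return Array.from(new Uint8Array(digest))
    .map((b) => b.toString(16).padStart(2, "0"))
    .join("");
}

// The same browser keeps producing the same hash across unrelated
// sites, with no cookies involved – and the server sees your IP on top.
buildFingerprint().then((id) => console.log("fingerprint:", id));
```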

It’s one thing when you are conscious of the process. It’s quite another when you’re not aware of what’s going on – that’s when it becomes genuinely critical. You simply can’t be in control when you don’t know what data interactions are happening in the background.

Data is power, and power can be abused

There are a few reasons why you might not want to share your data with companies, including anonymity, privacy intrusion, and ethics. The data points in question include your name, your physical and IP addresses, your contact information, your birthday, and other personal information that companies could use for a range of purposes.

Especially when mixed with other tracking data, opening this data up to third parties can bring serious consequences. Say you have a friend who struggles with mental health and you want to help them. As you research mental health support, you are tracked – and once data brokers sell that information to insurance companies, your entire insurance plan can be at risk. The consequences can be even direr, such as users being identified as homosexual, based on the content they browse, in countries where homosexuality is illegal.

It gets even trickier with different services and apps. Take 23andMe, a for-profit service conducting DNA genetic testing and analysis for ancestry reports. While the company says it won’t share your DNA data with any third party unless given consent, data shows that more than 80% of users actually opt in – with the company failing to clearly explain the consequences this could bring. And now that 23andMe plans to go public, it’s fair to ask why the company’s diverse stakeholders are so interested in simple genetic assessment.

If you knew your data would be anonymized and help drive cancer treatment research, you’d likely agree to share it – but without clear disclosure, users can’t make that informed choice.

Another example is the notorious Russian-made FaceApp, an application that went viral in 2019 by letting users upload their selfies and use powerful AI to generate altered portraits. While many of the privacy concerns have since been cleared up, cybersecurity specialist Jake Moore warns: “When anything is free, you must always ask yourself what is in it for the owners of the app and how do they make their money?” In the best-case scenario, your pictures will be used to train facial recognition algorithms.

The key lesson is that companies can put any personal data to nefarious use, and with complex privacy policies, users are often left with little more than the hope that their data won’t be misused.

There’s only so much users can do

Often, we believe that a browser’s Incognito Mode can solve our privacy problems. However, not all browsers will give you the anonymity you desire – in most cases, you can still be profiled. Incognito modes were designed to keep activity off the device: when you buy a present for a loved one on a shared computer, it won’t suddenly pop up in the browser history. But that doesn’t mean Google, Amazon, or other industry players can’t see your activity. Users need to know the right tools and features to leverage, from the Tor browser to privacy-focused cloud storage alternatives like Treasure.
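To see why incognito alone doesn’t hide you, consider what any web server can record on every request, private window or not. This is a minimal hypothetical Node.js sketch, not any particular company’s logging:

```typescript
// Incognito mode only keeps history off *your* device; every request
// still reaches the server with identifying metadata attached.
import { createServer } from "node:http";

createServer((req, res) => {
  // None of this is hidden by a private browsing window.
  console.log({
    ip: req.socket.remoteAddress,          // network-level identity
    userAgent: req.headers["user-agent"],  // browser + OS details
    referer: req.headers["referer"],       // the page that sent you here
    path: req.url,                         // what you looked at
    time: new Date().toISOString(),
  });
  res.end("ok");
}).listen(8080);
```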

However, instead of asking what you can do as a user, it’s key to flip the question around. What can companies do to improve the state of consumer security? In 2021, user trust is essential when growing a company. Businesses need to shift the onus from users to themselves and, by default, offer products and services that respect user privacy.

Adopting privacy by design means prioritizing privacy at every stage – architecture, design, and product development. Ultimately, this goes hand-in-hand with full transparency about what information the company collects and for what purpose.
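As a rough illustration of what privacy by design can look like in practice, here is a hypothetical sketch of data minimization – one of its pillars – where a service accepts only the single field a feature actually needs and discards everything else at the boundary (all names are illustrative):

```typescript
// Data minimization: the newsletter feature needs an email address and
// nothing else, so over-collection is rejected at the boundary instead
// of quietly stored "just in case".
interface NewsletterSignup {
  email: string;
}

function acceptSignup(raw: Record<string, unknown>): NewsletterSignup {
  const email = raw["email"];
  if (typeof email !== "string" || !email.includes("@")) {
    throw new Error("A valid email address is required.");
  }
  // Deliberately drop anything else the client sent (birthday,
  // location, device info): if it isn't needed, it isn't kept.
  return { email };
}

// Extra fields never reach storage, so they can never leak or be sold.
console.log(acceptSignup({ email: "reader@example.com", birthday: "1990-01-01" }));
```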

Today, we are far past the point where protecting privacy can be left as a burden on the user. If companies don’t act, only the most tech-savvy consumers will be well equipped to protect their privacy, leaving behind everyone who lacks the capability or awareness to act.

Updating privacy policies to be transparent and digestible, or incorporating notifications directly in the user interface, are great steps. However, they are just the beginning; true privacy means committing to bake its principles into the core of your product, always.
