Does analyzing employee emails run afoul of the GDPR?

A desire to remain compliant with the European Union’s General Data Protection Regulation (GDPR) and other privacy laws has made HR leaders wary of any new technology that digs too deeply into employee emails. This is understandable, as GDPR non-compliance may lead to stiff penalties.


At the same time, new technologies are applying artificial intelligence (AI) and machine learning (ML) to solve HR problems, analyzing employee data to support hiring, performance reviews or employee engagement tracking. This has great potential for helping businesses coach and empower employees (and thus retain top talent), but these tools often use employee emails as a data source. Does this create a privacy issue under the GDPR?

In most cases, the answer is “no.” Let’s explore the misconceptions behind this concern and explain how companies can stay compliant with global privacy laws while still using AI/ML workplace technologies to provide coaching and empowerment solutions to their employees.

Analyzing employee data with AI/ML isn’t unique to HR

First of all, many applications already analyze digital messages with AI/ML. Many of these are likely already in use at your organization, and they do not ask for consent from every sender for every message they analyze. Antivirus software uses AI/ML to scan incoming messages for viruses, chatbots use it to answer support emails, and email clients themselves use AI/ML to suggest responses to common questions as the user types or to prompt the user to schedule a meeting.

Applications like Gmail, Office 365 Scheduler, ZenDesk and Norton Antivirus do these tasks all the time. Office 365 Scheduler even analyzes emails using natural language processing to streamline the simple task of scheduling a meeting. Imagine if they had to ask for the user’s permission every time they did this! HR technologies that do something similar are not unique.
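To make this concrete, here is a toy sketch in Python of what “analyzing an email” for scheduling intent might look like at its very simplest. It is purely illustrative: the pattern list, function name and sample message are invented for this example, and real products such as Office 365 Scheduler rely on far more sophisticated natural language processing.

```python
import re

# Illustrative only: a crude stand-in for the NLP that scheduling assistants use.
# No human reads the message; the software just looks for scheduling language.
SCHEDULING_PATTERNS = [
    r"\bcan we (meet|talk|sync)\b",
    r"\bschedule (a|the) (call|meeting)\b",
    r"\bare you free (on|at)\b",
]

def detect_scheduling_intent(message_body: str) -> bool:
    """Return True if the message appears to propose a meeting."""
    text = message_body.lower()
    return any(re.search(pattern, text) for pattern in SCHEDULING_PATTERNS)

if __name__ == "__main__":
    email = "Hi Dana, are you free on Thursday to review the Q3 plan?"
    if detect_scheduling_intent(email):
        # The kind of prompt an email client might surface to the user.
        print("Suggest a calendar invite")
```

Even in this stripped-down form, the point stands: the analysis happens automatically, message by message, without asking each sender for permission.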

Employers also process employees’ personal data without their consent on a daily basis. Consider these tasks: automatically storing employee communications, creating paperwork for employee reviews or disciplinary action, or sending payroll information to government agencies. Employees don’t need to give consent for any of this, because a different legal basis allows the company to process data in this way.

Companies do not need employee consent in this context

This isn’t an issue because the GDPR offers five alternative legal bases pursuant to which employee personal data can be processed, including the pursuit of the employer’s “legitimate interests.” This concept is intentionally broad and gives organizations flexibility to determine whether their interests are appropriate, whether those interests are commercial, individual or broader societal benefits, and whether they belong to the company itself or to a third party.

The GDPR singles out fraud prevention and direct marketing as two specific purposes for which personal data may be processed in pursuit of a legitimate interest, but there are many more.

The “legitimate interests” basis gives employers grounds to process personal data using AI/ML applications without requiring consent. In fact, employers should avoid relying on consent to process employees’ personal data whenever possible. Employees are almost never in a position to give consent voluntarily or freely due to the imbalance of power inherent in the employer-employee relationship, so such consents are often invalid. In all the cases listed above, the employer relies on legitimate interest to process employee data. HR tools fall into the same category and don’t require consent.

A right to control your inbox

We’ve established that employers can process email communication data internally with new HR tools that use AI/ML and be compliant with the GDPR. But should they?

Here is where we move from legal issues to ethical issues. Some companies that value privacy might believe that employees should control their own inbox, even though that’s not a GDPR requirement. That means letting employees grant and revoke permission to the applications that can read their workplace emails (and which have already been approved by the company). This lets the individual control their own data. Other organizations may value the benefits of new tools over employee privacy and may put them in place without employees’ consent.

I have seen some organizations create a middle ground by making these tools available to employees but requiring them to opt in (rather than installing the tools by default and leaving employees to opt out, which puts the burden of maintaining privacy on them). This approach can both respect employees’ privacy and allow HR departments to use new technologies to empower individuals who choose to use them. That matters more than ever in the new era of widespread work from home, where workplace communication is abundant and companies are charting new courses to help their employees thrive in the future of work.
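As a rough sketch of what that opt-in middle ground can look like in practice, the Python snippet below models a simple permission registry in which analysis is off by default and an approved tool only sees an employee’s mail after an explicit, revocable opt-in. Every name in it (the registry class, the employee ID, the tool name) is hypothetical rather than drawn from any particular product.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of the opt-in model described above.
@dataclass
class OptInRegistry:
    # Maps an employee ID to the set of company-approved tools they have opted into.
    grants: dict[str, set[str]] = field(default_factory=dict)

    def opt_in(self, employee_id: str, tool: str) -> None:
        self.grants.setdefault(employee_id, set()).add(tool)

    def revoke(self, employee_id: str, tool: str) -> None:
        self.grants.get(employee_id, set()).discard(tool)

    def may_analyze(self, employee_id: str, tool: str) -> bool:
        # Default is "no": a tool only sees an employee's mail after an explicit opt-in.
        return tool in self.grants.get(employee_id, set())

if __name__ == "__main__":
    registry = OptInRegistry()
    registry.opt_in("emp-1042", "coaching-assistant")
    print(registry.may_analyze("emp-1042", "coaching-assistant"))  # True
    registry.revoke("emp-1042", "coaching-assistant")
    print(registry.may_analyze("emp-1042", "coaching-assistant"))  # False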

Fully understanding the compliance and privacy issues around new AI/ML tools is key to rolling them out effectively. These solutions can be powerful and may help your employees become more self-aware and better leaders, but organizations should work through those issues before putting the tools in place.
