How Facebook and Google nudge users to make anti-privacy choices

Facebook, Google and Microsoft use design techniques and tricks to steer users toward sharing more information about themselves to benefit those businesses, the Norwegian Consumer Council (NCC) has shown.

Among these so-called “dark patterns” are privacy-intrusive default settings, confusing layouts, illusions of choice, and attention-steering design choices (positioning, visual cues, etc.).


Privacy-intrusive defaults

“Facebook and Google have privacy intrusive defaults, where users who want the privacy friendly option have to go through a significantly longer process. They even obscure some of these settings so that the user cannot know that the more privacy intrusive option was preselected,” the NCC noted.

“The popups from Facebook, Google and Windows 10 have design, symbols and wording that nudge users away from the privacy friendly choices. Choices are worded to compel users to make certain choices, while key information is omitted or downplayed. None of them lets the user freely postpone decisions. Also, Facebook and Google threaten users with loss of functionality or deletion of the user account if the user does not choose the privacy intrusive option.”
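To make the “privacy-intrusive defaults” pattern concrete, here is a minimal, hypothetical sketch in TypeScript of how such a consent dialog behaves. Every name, option, and default below is illustrative and is not drawn from any of the companies’ actual code.

```typescript
// Hypothetical sketch of a "privacy-intrusive defaults" consent dialog.
// Names, options and defaults are illustrative, not any vendor's real code.

interface ConsentState {
  personalizedAds: boolean;
  faceRecognition: boolean;
}

// The dark pattern: the data-sharing options arrive preselected...
const defaults: ConsentState = {
  personalizedAds: true,   // opted in before the user has chosen anything
  faceRecognition: true,   // opted in before the user has chosen anything
};

// ...the prominent one-click path keeps every preselected default...
function acceptAndContinue(state: ConsentState): ConsentState {
  return state;
}

// ...while the privacy-friendly outcome is only reachable after the user
// digs through extra screens and flips each toggle off individually.
function manageDataSettings(state: ConsentState): ConsentState {
  return { ...state, personalizedAds: false, faceRecognition: false };
}

console.log("Accept and continue ->", acceptAndContinue(defaults));
console.log("Manage data settings ->", manageDataSettings(defaults));
```

The asymmetry is the point the NCC highlights: accepting takes a single click and keeps the intrusive defaults, while the privacy-friendly choice demands deliberate, repeated effort.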


If you’re wondering what these dark patterns look like in practice, check out the NCC’s detailed report.

Anti-privacy dark patterns and GDPR

Digital service providers’ use of these design techniques is arguably unethical, the NCC added, but it may also run afoul of the EU’s General Data Protection Regulation (GDPR).

“Data protection law requires that companies make it easier for users to make clear and informed choices, and that they let users take control of their own personal data. Unfortunately, this is not the case, which is at odds with the expectations of consumers and the intention of the new Regulation,” noted Finn Myrstad, director of digital services at the Norwegian Consumer Council.

The NCC has therefore asked the Norwegian Data Protection Authority and the Norwegian Consumer Agency to investigate whether the companies are acting in accordance with the GDPR’s data protection principles (i.e., data protection by design and by default).

Princeton professor Arvind Narayanan pointed out that, while the report examines three companies, dark patterns have become pervasive and institutionalized.

“In ongoing research, [Princeton grad Arunesh Mathur] has found that designers trade strategies and templates for dark patterns on message boards, treating the GDPR as a nuisance to work around. Left unchecked, dark patterns will negate much of the benefit of the GDPR and other consumer protection laws,” he added.
