In November 2017, online fitness tracker Strava published a heatmap of the activity many of its users around the world engage in (and track) daily. But what might have seemed like a harmless release of anonymized, aggregated data turned out to reveal potentially sensitive information about (mostly Western) military bases and secret sites.
The revelation was made and shared over the weekend by Nathan Ruser, an Australian university student and a founding member of the Institute for United Conflict Analysts, a grassroots intelligence organization.
Strava released their global heatmap. 13 trillion GPS points from their users (turning off data sharing is an option). https://t.co/hA6jcxfBQI … It looks very pretty, but not amazing for Op-Sec. US Bases are clearly identifiable and mappable pic.twitter.com/rBgGnOzasq
— Nathan Ruser (@Nrg8000) January 27, 2018
He pointed out that soldiers and security personnel obviously use the app during their daily exercise and patrols.
The activity information shared with Strava ended up in the heatmap, and could provide potential attackers with useful insight into the “pattern of life” in and around military bases, secret intelligence sites, and training facilities around the world.
Strava has pointed out that activities marked as private by users were not included in the heatmap, and that activities were cropped to respect user-defined privacy zones. “Athletes with the Metro/heatmap opt-out privacy setting have all data excluded,” the company added.
Unfortunately, many users did not think to make their data private.
And, as some noted, the information collected by Strava, some of the tools it provides, and the information users themselves share on public profiles can be scraped and used to target them:
Okay here is where things get problematic: Via Strava, using pre-set segments we can scrape location specific user data from basically public profiles (and yes those exist w/in bases and lead us straight so social media profile of service members). https://t.co/VDNBGcKvIY
— Tobias Schneider (@tobiaschneider) January 29, 2018
Security and privacy researcher Lukasz Olejnik pointed out that anonymising location and fitness data is challenging, and that the risks should be weighed on many different levels before publishing even aggregated data.
While admitting it could be a daunting prospect, he noted that a privacy impact assessment should be a must before publishing any big user dataset.
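Olejnik's point can be illustrated with a toy sketch (all coordinates and numbers below are made up for illustration, not taken from Strava's data): even after user IDs are stripped and GPS points are aggregated into grid-cell counts, repeated activity along the same route makes the route itself stand out in the aggregate.

```python
import math
from collections import Counter

def heatmap(points, cell=0.001):
    """Aggregate (lat, lon) points into grid-cell counts; no user IDs are kept."""
    counts = Counter()
    for lat, lon in points:
        counts[(round(lat / cell), round(lon / cell))] += 1
    return counts

# Hypothetical scenario: personnel jog a perimeter loop at a remote site.
track = [(34.0 + 0.002 * math.cos(t / 50 * 2 * math.pi),
          45.0 + 0.002 * math.sin(t / 50 * 2 * math.pi))
         for t in range(50)]
points = track * 200          # 200 fully "anonymous" runs of the same loop

hm = heatmap(points)
hot = [c for c, n in hm.items() if n >= 200]

# The hottest cells trace the loop: the aggregate alone outlines the site,
# even though no individual user or run is identifiable in the data.
print(len(hot), "grid cells light up along the perimeter")
```

The sketch shows why aggregation alone is not anonymization: in a sparsely populated area, the aggregate heat is the pattern of life.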