The lone hacker used to be the stereotypical threat to national security: think of Kevin Mitnick, who prosecutors claimed could have started a nuclear war by whistling into a phone. Then it was the citizen activist or whistleblower (perhaps affiliated with WikiLeaks). Then it was the state-sponsored group – perhaps Chinese, perhaps North Korean, perhaps Russian. The Stuxnet worm (which damaged equipment in Iran, and which was most likely the result of US–Israeli teamwork) showed that the digital world could have a very real impact on the physical, lending a whole new urgency to the need to keep computer security levels as high as possible.
So it seems almost anticlimactic that it was an app for tracking users' runs that has undone so much secrecy for the US military in particular. The lesson is that perhaps the greatest threat to operational security lies in the treasure trove of data we often unwittingly produce – and which private companies, in jurisdictions with limited governmental oversight, often think little about.
The case of the app, Strava, is almost farcical: the company decided to publish heat maps of its users' activity to show off its success, and got more than it bargained for when the visualization effectively provided detailed maps of routes in military bases around the world. Much of that data may be of limited use to opposing nations or non-state actors (particularly where Strava doesn't seem to have been heavily used). On the other hand, in spaces where the heat maps are bright, the visualization essentially sketches out a handy blueprint for troop movement. At its worst, it highlights locations where military forces were not known to be. It is, to put it lightly, a fairly horrifying outcome for the US armed forces, since Strava is far less heavily used by their opponents (whether those opponents use different apps or simply lack fitness trackers remains to be seen).
Strava, in a move either impossibly brave or impossibly foolhardy, at first went on the defensive and suggested that military personnel should have opted out. There is a neat logic to this for the company (which has kept the heat maps up on its site), since it shifts the onus onto the consumer. This is the stance taken by social media platforms like Facebook whenever an embarrassing event comes to light: you should have set your privacy settings higher!
This is also a convenient way to glide over the conflict of interest in the argument: lower privacy settings mean greater data collection, which means more information to sell on to other companies. By setting privacy as low as possible by default, and by ensuring that changing this is not a simple process, companies like Strava get to have their cake and eat it. The fact that Strava has now offered to change how its privacy settings work should not be read as a company owning up to its mistakes: it is a PR move and a volte-face, part of an attempt to cover up its own shortcomings.
Designs are not confusing simply because of incompetent programmers – in fact, quite the opposite can be true: obscurity serves the business. For too long Big Tech has gotten away with hoovering up Big Data, offering only belated fixes that do little about all the information already collected. This is only going to get worse as we head further into the age of the Internet of Things, where privacy policies will be increasingly obscure and opting out of a small, screenless device will be practically impossible. Whether change comes from consumer pressure or a governmental crackdown (such as heavy enforcement of the GDPR against the most egregious offenders), the Strava story is just another case of Big Tech's disregard for all of us except in terms of how much our data sells for.