The Strava heat maps are a grim reminder of Big Tech’s power

Reading Time: 3 minutes

The lone hacker used to be the stereotypical threat to national security: think of Kevin Mitnick, who prosecutors claimed could have started a nuclear war by whistling into a phone. Then it was the citizen activist or whistleblower (perhaps affiliated with WikiLeaks). Then it was the state-sponsored group – perhaps Chinese, perhaps North Korean, perhaps Russian. The Stuxnet virus (which damaged equipment in Iran, and which was most likely the result of US-Israeli teamwork) showed that the digital world could have a very real impact on the physical, lending a whole new urgency to the need to keep computer security levels as high as possible.

So it seems almost anticlimactic that an app for tracking users’ runs has undone so much secrecy for the US military in particular. It’s a lesson that perhaps the greatest threat to operational security lies in the treasure trove of data which we often unwittingly produce – and which private companies, operating in jurisdictions with limited governmental oversight, often think little about.

The case of the app, Strava, is almost farcical: the company decided to publish heat maps of its users’ activity to show off its success, and got more than it bargained for when the visualization effectively provided detailed maps of running routes on military bases around the world. Much of that data may be of limited use to opposing nations or non-state actors (particularly where Strava doesn’t seem to have been heavily used). On the other hand, in spaces where the heat maps are bright, the visualization essentially sketches out a handy blueprint of troop movement. At its worst, it highlights locations where military forces were not known to be. It is, to put it lightly, a fairly horrifying outcome for the US armed forces, since Strava is far less heavily used by opponents (whether they use different apps or simply lack activity trackers remains to be seen).
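
To make that concrete, here is a minimal sketch of the kind of aggregation a heat map performs – not Strava’s actual pipeline, and with invented coordinates and an arbitrary grid size – showing how individually innocuous GPS samples add up to a revealing picture:

```python
import numpy as np

# Hypothetical GPS samples (latitude, longitude), each logged during
# some user's run. Individually they reveal very little.
points = np.array([
    [34.0012, 44.3581],
    [34.0013, 44.3584],
    [34.0015, 44.3590],
    # ... in practice, millions more samples from many users
])

# Bin the samples into a coarse grid: each cell counts how many points
# fell inside it. The bright cells of the rendered heat map are simply
# the cells with the highest counts - routes run over and over again.
heat, lat_edges, lon_edges = np.histogram2d(
    points[:, 0], points[:, 1], bins=100
)

# The aggregate, not any single trace, is what sketches out the
# blueprint of daily movement on a base.
print(int(heat.max()), "samples in the busiest cell")
```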

Strava, in a move either impossibly brave or impossibly foolhardy, at first went on the defensive and suggested that military personnel should have opted out. There is a nice logic to this for the company (which has kept the heat maps up on its site), since it puts the onus onto the consumer. This is the stance taken by social media platforms like Facebook whenever an embarrassing event comes to light: you should have set your privacy settings higher!

This is also a neat effort to glide over the conflict of interest in that argument: lower privacy settings mean greater data collection, which means more information to sell on to other companies. By setting privacy as low as possible by default, and by ensuring that changing this is not a simple process, companies like Strava get to have their cake and eat it. The fact that Strava has now offered to change how its privacy settings work should not be read as a company owning up to its mistakes: it is a PR-driven volte-face, part of an attempt to paper over its own shortcomings.

Confusing designs are not simply the work of incompetent programmers – in fact, quite the opposite can be true. For too long Big Tech has gotten away with hoovering up Big Data before issuing belated fixes which do little about the information already collected. This is only going to get worse as we head further into the age of the Internet of Things, where privacy policies will be increasingly obscure and opting out on a small, screenless device will be practically impossible. Whether the answer is consumer pressure or a governmental crackdown (such as heavy enforcement of the GDPR on the most egregious offenders), the Strava story is just another case of Big Tech’s disregard for all of us except in how much our data sells for.

GDPR: five guides for the five months to go

Reading Time: 2 minutes

The General Data Protection Regulation (GDPR) kicks in on May 25th, and promises to be one of the most comprehensive shake-ups of how data is handled in history. It’s not just European companies which will be affected: anyone hoping to do business with the European Union and Britain will have to ensure that they are up to scratch. That’s a big boon for customers, who will gain control over who gets access to their personal information, but a massive wall for companies who have often played fast and loose when it comes to security.

Our Insight piece last month picked up that guides remained the top trending term with regard to GDPR – so for this blog, we’ve gathered some of the best writing on GDPR – for companies and for customers – showing how businesses can stay within the lines.

Europe’s data rule shake-up: How companies are dealing with it – The Financial Times

A very clear piece, with case studies of how companies affected by GDPR will have to change their business practices to stay compliant. It also includes an important section on how GDPR shifts the boundaries of the right to be forgotten. Where it used to be the responsibility of data controllers to ensure data privacy, data processors (including big companies like Microsoft and Amazon which host data for other businesses) will now be on the spot to deal with the issue: a potentially Herculean task, depending on how well their data is kept.

How the EU’s latest data privacy laws impact the UK’s online universe: Tips to prepare your website for GDPR – The Drum

A great step-by-step guide to GDPR-proofing a website, with a breakdown of the areas where sites can fall out of compliance. It’s simple, but makes clear how easy it is to fall foul of the new laws without firm preparation.

Rights of Individuals under the GDPR – The National Law Review

A worthwhile read both for users curious about their new rights and for companies who will have to ensure those rights are met to avoid hefty fines. These include the right to access any data held on them by an organisation, the right to withdraw consent at any time during the data processing or collection period, and the right to judicial remedy against data controllers and processors.

How Identity Data is turning toxic for big companies – Which 50

Less of a guide than the other pieces, but a good read nonetheless for those keen to understand how the information ecology is fundamentally shifting. It points to the increasingly high number of annual breaches affecting large companies – and to the fact that the fines levied under GDPR may make storing so much data, so poorly, more costly than it is worth.

General Data Protection Regulation (GDPR) FAQs for charities – Information Commissioner’s Office 

A handy piece for charities and small businesses looking to stay compliant, right from the horse’s mouth, including links to the ICO’s self-assessment page and other tools and guides to ensure that businesses don’t stray beyond the lines.

The GDPR presents a challenge to existing norms, and businesses will have to step up to stay compliant. But it also presents an opportunity for ethical data processing, and a greater bulwark against the breaches which seem to plague the tech industry: a vital step, at a time when big tech stands on the brink of either moving forwards or going the way of the old monopolies.

Netflix’s Tweet May Have Been Made Up, But That Shouldn’t Make Us Much Happier

Reading Time: 2 minutes

The tweet was undoubtedly meant in good humour: a little post by Netflix claiming that 53 people had watched A Christmas Prince 18 days in a row. A light-hearted jibe, in the vein of banter so heavily mined by Nando’s. That figure, as some commentators suggested, may well have been plucked from thin air – a symbolic number, if you like.

And yet the tweet inadvertently underlined an uncomfortable truth about both the scale of data collection by services like Netflix or Spotify, and the power which all that information gives their algorithms. Whether or not the number is true, Netflix knows a lot about you.

By now, in the wake of Snowden and Wikileaks, you’d be hard pressed to find a citizen in any democracy who didn’t have some inkling of public surveillance.
Yet in some ways, the equally pervasive work of our entertainment apps goes unnoticed.
Perhaps it’s because most of the time, it doesn’t go out of its way to draw our attention to its specificity or scope. The ‘magic sauce’ of recommendations from Spotify is not merely their accuracy, but equally their opacity. Pull back the curtain and instead of the Wizard, you find algorithms and reams and reams of data. Whilst a black box may not satisfy the more paranoid, it offers consumers space to insert a more positive image.
Indeed, as Netflix’s faux pas proved, drawing people’s attention to data collection processes which hoover up personal (if not private, or strictly speaking sensitive) information is the best way to convince them they’re in the Panopticon.

Of course, there’s nothing to suggest Netflix has weaponised this data in any particular way, beyond recommendations and somewhat unfunny jokes. Even assuming that 53 people really did watch one movie once a day for nigh on three weeks, there’s still the question of what level of identity is available. Are people’s whole life details on offer for any employee to see and laugh at? It would seem unlikely. Far more probable would be data in aggregate – details which are anonymised in essence.

The best response to the outrage surrounding the tweet is not to pour more fuel on social media moral panics, but to use it as a teachable moment. However nefarious these platforms are or aren’t, the amount of information gleaned through the entertainment services we use daily is immense – something which we, as consumers on the other side, can forget.

Secondly, it’s important to understand why, to big data analysts, individual personal details are less key. Data combined with other datasets allows them to discover details about a user which would previously have been impossible to glean.
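
To illustrate why – a hypothetical sketch with invented data and column names, not any real company’s pipeline – here is how two individually ‘anonymous’ datasets can be joined on shared quasi-identifiers to reveal something neither contains on its own:

```python
import pandas as pd

# Hypothetical viewing log: no names, just an opaque account ID plus a
# postcode area and birth year collected at sign-up.
viewing = pd.DataFrame({
    "account_id": ["a1", "a2", "a3"],
    "postcode":   ["SW1", "M4", "E2"],
    "birth_year": [1985, 1992, 1978],
    "title":      ["A Christmas Prince"] * 3,
    "days_in_a_row": [18, 2, 5],
})

# A second, equally 'harmless' dataset - say, a marketing list acquired
# elsewhere - that pairs names with the same quasi-identifiers.
marketing = pd.DataFrame({
    "name":       ["Alex P.", "Sam K."],
    "postcode":   ["SW1", "M4"],
    "birth_year": [1985, 1992],
})

# Joining on postcode + birth year links a named person to their viewing
# habits, even though neither dataset stored that link explicitly.
linked = viewing.merge(marketing, on=["postcode", "birth_year"])
print(linked[["name", "title", "days_in_a_row"]])
```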

Finally, it’s key to acknowledge the centrality of algorithms. Far from being terrifying cybernetic creatures, they are the lifeblood of so much of what we do. Granted, algorithms are by no means neutral – think of the risks in policing and sentencing algorithms – but they can serve less dubious purposes too.

Big Tech is most dangerous when we understand it least. We should be grateful for Netflix’s quite clear blunder: it offers an opportunity to understand it a little better.