Data viewpoint: Police, privacy and the private sector 

March 2023 - Shoosmiths LLP

Living in a data democracy, we have got used to thinking that ‘data protection’ in the private sector is separate from police and intelligence services activity.

But is this really true?

In Europe, data protection rules for the two spheres originated in different directives: one covering ‘general’ processing and one covering ‘law enforcement/intelligence services’ processing. In the UK, they were conveniently separated under different parts of the Data Protection Act 2018, and there was much careful consideration of whether processing fell under ‘Part 2’ (general processing, which covers commercial entities) or ‘Parts 3 and 4’ (processing by law enforcement and intelligence services respectively).

As techniques for data gathering and analysis expand, it’s becoming increasingly clear that the world of government security functions and the world of commerce can’t continue to be separated like this. Why is this happening? At root, because large undifferentiated data flows contain material of interest to every sector of society. And what might have seemed an impossible task – filtering out the activities and individuals of interest – is now made possible by sophisticated data analytics tools.

So each sphere has huge potential to disrupt the other, with consequences good and bad.  It’s worth considering two developments here.

The first development, observed over the last few years, is that perceived data overreach by security agencies has enormous potential to disrupt commerce by threatening the adequacy of the data transfer mechanisms that apply to data for commercial use. Long-standing protests (predominantly in Europe) about the mass surveillance by security agencies of data transferred to the US have effectively led to a breakdown in transfer mechanisms for commercial purposes. The effect can be indirect as well: the recent US/UK deal on Overseas Production Orders for accessing crime-related data has been challenged by members of the European Parliament as possibly threatening the whole data transfer framework between the UK and Europe. The same data transfers could face another threat if the UK’s regime for bulk surveillance is judged not to comply with the requirements set out by the European Court of Human Rights in its landmark 2021 Big Brother Watch ruling.

Somewhat surprisingly, national governments have (at least in theory) made considerable concessions in the face of the threat to commerce: notably in the new EU/US data transfer framework, which provides some redress to individuals concerned about eavesdropping by security agencies. Big tech, which demands smooth international data flows, has perceived the threat to its business from rights-based legislators, and has ensured that some brakes are put on security-related activities at national level. Benign checks and balances, perhaps.

The second development is perhaps more alarming: the new role played by the private sector in law enforcement. Crime and intelligence agencies - what we might call the ‘security agencies’ - now rely overwhelmingly on the private sector to develop tools to observe, predict and control disruptive behaviour.

Take the activities of Clearview AI, the US company that has developed a 20 billion-strong database of faces scraped from social media around the globe, which can be used to identify individuals. Who is the main customer for this tool? Law enforcement in the US – and it’s apparently very useful.

In the old days, we would simply have said that a private sector company can develop a tool but not use it for law enforcement purposes – while a government agency can purchase the tool and use it within the safeguards set up by its national laws. This distinction is now hopelessly complicated by the fact that developing these tools relies on techniques of bulk surveillance being carried out within the private sector – as with Clearview. It may even require a private sector company to process data gathered by a security agency in order to develop the tool.

This uncomfortable relationship is beginning to have some interesting fallout. Clearview has been fined by regulators in Europe under the general rules applying to data processing, but to no avail – with no commercial presence in Europe, the company denies their jurisdiction. And what about that useful recent transatlantic accord designed to protect European personal data from the prying eyes of US security agencies? No good, of course – not least because the agreement only covers the activities of the US intelligence services, not of the US companies developing tools for them to use. Though it would be interesting to see what would happen if a European citizen were charged as a result of information obtained by using the product in the US.

According to the UK regulator, the UK government had some involvement in testing the Clearview capability, but it hastily stepped back following concerns (including from the regulator itself) that the data had been unlawfully processed, at least from a European perspective. Perhaps anticipating future problems, however, the UK government has taken a bold step in its proposed update to the GDPR, the draft Data Protection and Digital Information Bill – just released in its second version. Here, lawful bases for data processing which used to be squarely within the remit of security agencies have simply been gifted en masse to the private sector. So a UK organisation similar to Clearview would perhaps have less difficulty in justifying its activities. It feels like a short step to allowing the private sector to take over what were previously considered to be the functions of security agencies.

And when agencies can access multiple databases, we also face the parallel problem of mission creep. The latest Code of Practice for the new Law Enforcement Data Service in the UK, which sets out when crime agencies can access ‘centralised information about individuals, property and vehicles’, acknowledges that the permitted purposes extend beyond strictly law enforcement ones. ‘Protecting life’ and ‘preserving order’ are ends potentially far removed from anything to do with the commission of criminal offences. More blurring of those thin blue lines.

That said, the ongoing dance between commercial and security imperatives means that any move which threatens international commercial data flows, by justifying removal of the UK’s adequacy finding, will carry a high price: back to the first development.

Of course, all this is happening while technology drives a third development: every citizen becoming part of the security function. All of us are already free to act as surveillance operatives - at home with our CCTV cameras and smart doorbells, and out and about with our dashcams and mobile phones. And before long, our very environment may be watching us too: the infrastructure required for autonomous vehicles may be picking up continuous streams of data from our cars, including about what we and our passengers are up to. There are already uncomfortable conflicts between privacy and policing in the use of this type of data, and it looks as if the game is about to get a whole lot more complex.

 


