Should Clearview AI escape ICO punishment in the UK?
Alongside Cambridge Analytica, Clearview AI has emerged as perhaps the best-known maverick of data analytics. It has been fined by data protection regulators around the world, including in the UK, for developing what regulators consider unlawful facial recognition technology.
Everyone knew this fine was never going to be enforced. But not even the most die-hard data libertarian thought that it would be overturned by the UK’s own domestic courts. Yet this has just happened in the First-tier Tribunal.
What is Clearview AI?
Clearview is a US company. It scrapes the global internet through fast, automated data collection to create a database of images of people, including us in the UK. In October 2022 it held around 20 billion images, and the database grows by 75 million images per day.
It’s a massive global facial recognition system. You can’t buy the database, but you can ask Clearview to run your pictures through the system and they will provide a series of possible matches together with an estimate of how accurate each one is. At the press of a button, it’s possible to take an unknown face and identify and track that person across the internet, throughout the world.
Clearview’s commercial activity
At first, Clearview offered this service to anyone prepared to pay. But this proved hard to stomach even in the US where data protection rules are famously inconsistent. So as part of a US privacy settlement, Clearview agreed from 2020 to offer the service only to law enforcement and national security organisations. It is now used by law enforcement customers around the world including in the US, Panama, Brazil, Mexico, and the Dominican Republic.
Given its extraordinary powers and unprecedented scale, the system has come under challenge for infringing data protection legislation. Last year, in the UK, the Information Commissioner’s Office (ICO) fined Clearview AI £7.5 million for a litany of breaches of the UK GDPR. This included failing to collect consent or have a lawful basis for processing the images.
A few days ago the First-tier Tribunal determined that the ICO does not have jurisdiction over Clearview AI. In brief, this is because Clearview AI’s clients have, since 2020, been exclusively carrying out law enforcement or national security work outside the UK, which is beyond the scope of the UK GDPR.
This seems surprising given that Clearview is a private company. Why can they coat-tail onto exceptions carved out for law enforcement and national security? Why indeed. It’s part of the brave new world of private sector involvement in law enforcement which we flagged in March. By blurring the lines of who does what, activities can end up escaping regulation.
And so it has come to pass. Here’s what has happened, and why we think the decision should not be allowed to stand.
The decision in more detail
The reasoning of the decision is that the UK GDPR (like the EU GDPR), being founded on EU law, does not cover the acts of foreign governments, which fall outside the scope of Union law. Because Clearview is at the moment only offering services to foreign governments, it is not covered by the UK GDPR and the ICO has no jurisdiction.
The first problem to grapple with is the conflation of law enforcement and national security. Even if national security falls outside its remit, Union law does cover the data protection activities of law enforcement authorities. It has its own parallel EU directive, the ‘Law Enforcement Directive’, which exists alongside the GDPR and whose implementation forms part of the UK Data Protection Act. So it’s hard to understand how law enforcement activities have been swept off the table on the basis that they are ‘outside Union law’.
There is hardly any analysis of these underlying concepts in the ruling. But even if a deeper dive found that non-domestic law enforcement is indeed out of scope, a second argument is harder to bat away. It’s true that the UK GDPR does not cover the activities of competent authorities when acting for law enforcement purposes. But crucially, Clearview is not a competent authority, so its activities cannot be excluded from the scope of the GDPR.
This is a crucial point and it’s one underlined in the decision itself. The analysis in the ruling is that Clearview is a separate controller in respect of some of its activity (‘indexing’) and a joint controller with customers for other activity (‘matching’). So the judges have decided that Clearview is determining the means and purposes of at least some of the processing, and must therefore be carrying out separate processing activity from that of any competent authority. It cannot, then, be carrying out either law enforcement processing (which can only be carried out by a controller which is a competent authority) or the ‘acts of foreign governments’ (as it is a separate controller which is not a government entity), and so must fall inescapably under the UK GDPR.
Where does this leave us?
To put it in non-legal terms, when the private sector is helping law enforcement by developing capability under its own steam, it must do so within the law which applies to the private sector, at least if it wants to carry out controller activity. The fact that the capability will later be used by a competent authority is a secondary issue and one which must be judged under the rules applicable to that separate activity.
This separation is one which the government knows about and is seeking to change in the UK under the Data Protection and Digital Information (No.2) Bill. But as the law stands today the distinction is clear and must in our view be respected for any company – anywhere in the world – caught by the GDPR while carrying out processing that relates to monitoring the behaviour of individuals in the UK. Otherwise a company operating globally can simply game the system by offering governments and law enforcement a capability which it would be unlawful to develop within and for its own jurisdiction.
Things are playing out very differently in Europe. Under a recent EU ruling, possible mission creep by the private sector was stamped on, with Meta being firmly told that the prevention and detection of criminal offences cannot be a legitimate interest of a commercial controller.
Back in the UK, the tribunal was not invited to consider the Law Enforcement Directive at all, nor was it offered any analysis of the scope of Union law, and it made none itself. We can only hope that a higher court gets a chance to rethink the position with the benefit of more evidence.
If it doesn’t, we may all be in trouble with the law.