
Clearview AI and the ethics of data scraping for facial recognition

At Hicomply, we have been following the Clearview AI lawsuit with some interest. Thanks to a jurisdictional loophole in the UK GDPR, the company avoided a fine last week despite allegedly scraping and storing photo data from millions of UK residents via their social media accounts.

What is Clearview AI?

Clearview AI is a facial recognition company whose services are used predominantly by government and law enforcement organisations. Its facial recognition software relies on a database of over 30 billion images of people’s faces and associated data, collected from publicly available information on the internet and on social media platforms.

The company allows its customers, including police forces, to upload an image of a person to its app, which is then checked for a match against every image in the database. The app returns a list of images with characteristics similar to the photo provided, each accompanied by a link to the website from which it came.
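Clearview has not published how its matching works, but systems of this kind are generally built as a nearest-neighbour search over face “embeddings”: fixed-length vectors produced by a trained neural network, where similar faces yield similar vectors. The sketch below is a minimal illustration of that principle, using random vectors as stand-in embeddings; everything in it is hypothetical and is not Clearview’s implementation.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity of two face embeddings, in [-1, 1]."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def search(query: np.ndarray, database: list[tuple[np.ndarray, str]],
           top_k: int = 10, threshold: float = 0.8) -> list[tuple[float, str]]:
    """Rank stored (embedding, source_url) pairs against a query embedding.

    Returns the best matches above `threshold`, each with the URL the image
    came from -- mirroring the "list of similar images plus links" behaviour
    described above.
    """
    scored = sorted(
        ((cosine_similarity(query, emb), url) for emb, url in database),
        reverse=True,
    )
    return [(score, url) for score, url in scored[:top_k] if score >= threshold]

# Toy demonstration with random 128-dimensional vectors standing in for the
# output of a real face-recognition network.
rng = np.random.default_rng(0)
db = [(rng.standard_normal(128), f"https://example.com/photo/{i}") for i in range(1000)]
query = db[42][0] + 0.05 * rng.standard_normal(128)  # a near-duplicate of photo 42
print(search(query, db, top_k=3))
```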

The technology has been called a “Shazam for people that could end privacy as we know it.” Given the high number of UK internet and social media users, Clearview AI’s database is likely to include a substantial amount of data from UK residents, gathered without their knowledge.

Although Clearview AI no longer offers its services to UK organisations, the company is not banned and has customers in other countries, so its facial recognition software is still using the personal data of UK residents.

What was the original Clearview AI fine?

In May 2022, Clearview was fined £7.5m by the Information Commissioner’s Office (ICO) and ordered to delete UK residents’ data. The controversy stems from the fact that internet users were not informed that their images were being collected or used in this way; the company relied on the fact that these social profiles and posts are “public” to justify using personal images. It’s worth noting that this included business-oriented social platforms such as LinkedIn as well as personal ones like Facebook and Twitter.

The UK ICO stated it believed that Clearview AI breached UK data protection laws by:

  • Failing to use the information of people in the UK in a way that is fair and transparent, given that individuals are not made aware or would not reasonably expect their personal data to be used in this way;
  • Failing to have a lawful reason for collecting people’s information;
  • Failing to have a process in place to stop the data being retained indefinitely;
  • Failing to meet the higher data protection standards required for biometric data (classed as ‘special category data’ under the GDPR and UK GDPR);
  • Asking for additional personal information, including photos, when members of the public asked whether they were on its database. This may have acted as a disincentive to individuals wishing to object to their data being collected and used.

What was the GDPR loophole that enabled the fine to be overturned?

Judges ruled that Clearview AI broke no law when it sold its database to police forces, because the buyers were outside the UK and therefore beyond the ICO’s jurisdiction.

The £7.5m fine and deletion order issued by the ICO in 2022 were overturned last week. On 17 October, the London tribunal ruled that the fine should be struck down because Clearview only advertised its database to law enforcement agencies based outside the UK and EU. The judges said the GDPR therefore did not apply, because there is an exemption for foreign law enforcement activity.

Clearview’s general counsel Jack Mulcaire said: “We are pleased with the tribunal’s decision to reverse the UK ICO’s unlawful order against Clearview AI.” An ICO spokesman said it would “carefully consider [its] next steps”. Campaign group Privacy International described the ruling as “nonsensical” and “extremely puzzling”. Lucie Audibert, a lawyer for Privacy International, said: “It’s saying to companies, ‘hey, you can do whatever the hell you want with UK residents’ data as long as you don’t sell it to the UK government’.”

Why is the overturning of the Clearview AI fine concerning for UK residents’ privacy?

This is a case study in what has been termed “surveillance capitalism”. The wider privacy concern is what non-UK companies are doing with UK residents’ personal and business data, and how they can evade UK laws because of the limits on UK law’s international jurisdiction. This is a landmark decision that should make all 68 million UK businesses and residents think carefully about any data they put into software or systems hosted or owned in the USA, or indeed in any other country outside the UK.

Can loopholes in legislation enable non-EU organisations to avoid the privacy laws set up to prevent such actions?

We’re also seeing heightened scrutiny of data privacy laws amid significant advances in artificial intelligence (AI). AI models powering services such as ChatGPT feed on huge volumes of text and images, almost all of which are scraped from publicly accessible websites and social media platforms.
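To make concrete how low the technical barrier to this kind of collection is, the sketch below lists every image on a public web page in a few lines of Python. It is a generic illustration using the widely available requests and BeautifulSoup libraries against a placeholder URL, not anything Clearview-specific; production scrapers add crawling, storage and face detection on top, and lawful ones also respect robots.txt and platform terms of service.

```python
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin

def list_image_urls(page_url: str) -> list[str]:
    """Return the absolute URL of every <img> tag on a public page."""
    html = requests.get(page_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    return [urljoin(page_url, img["src"])
            for img in soup.find_all("img") if img.get("src")]

# Placeholder target: the point is how little code "publicly available"
# data collection requires, not any specific website.
for url in list_image_urls("https://example.com"):
    print(url)
```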

There is a counter-argument that this simply uses data UK residents have “made public”, and that this type of product is good for law enforcement in an ever-fractious world. Is that really the case? Can tools like this be kept out of the hands of “bad actors”? For most people, it is happening without their knowledge or consent.

How many UK residents’ photos does Clearview have for sale on its platform?

Clearview says on its website that it has assembled a database of more than 30 billion images of people’s faces, along with identifying details.

In its privacy policy, the company says “Publicly available photos and information derived from them: As part of Clearview’s normal business operations, it collects photos that are publicly available on the internet. The photos may contain metadata which may be collected by Clearview due to it being contained in the photos, and information derived from the facial appearance of individuals in the photos.”
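The “metadata” the policy refers to is typically EXIF data embedded in the image file, which can include the capture time, the device used and even GPS coordinates. As a brief illustration (the file path is a placeholder), a few lines of Python using the Pillow library are enough to read it:

```python
from PIL import Image
from PIL.ExifTags import TAGS

def read_exif(path: str) -> dict:
    """Return the human-readable EXIF tags embedded in an image file."""
    with Image.open(path) as img:
        exif = img.getexif()  # empty if the metadata has been stripped
        return {TAGS.get(tag_id, tag_id): value for tag_id, value in exif.items()}

# Placeholder path. Photos taken on a phone often carry timestamps, device
# models and GPS coordinates, although most social platforms strip EXIF data
# from images on upload.
print(read_exif("photo.jpg"))
```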

Hicomply researched the main platforms in the UK from which data has been sourced, and estimates that at least 45 million UK residents have photos online, around 40 million of whom do not have fully private accounts, meaning at least some of their photos are visible:

  • LinkedIn - In June 2023 there were around 38.1 million LinkedIn users in the United Kingdom, up from 36.9 million the previous month. User numbers grew steadily throughout 2022, and the employment-oriented network has seen 11 percent growth since June 2022.
  • Facebook - Out of the total estimated UK population of 67.9 million, approximately 44.84 million people are active Facebook users: roughly 66% of the total UK population (a share sanity-checked in the sketch after this list).
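As a quick check, the 66% figure follows directly from the two numbers quoted above:

```python
uk_population = 67.9e6    # total estimated UK population
facebook_users = 44.84e6  # active UK Facebook users
print(f"{facebook_users / uk_population:.0%}")  # -> 66%
```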

So, who is using UK residents’ social media photos in Clearview AI’s product?

A list of Clearview AI's customers was leaked in 2020. It revealed that the company had 2,200 clients spread across 27 countries, including Saudi Arabia, the UAE and India.

The list allegedly included “law enforcement departments, government agencies, and companies,” although some clients only trialled the service for 30 days. At the time, a spokesman for Clearview said its app had “built-in safeguards to ensure these trained professionals only use it for its intended purpose: To help identify the perpetrators and victims of crimes.” In May 2022, Clearview settled a US class action lawsuit, agreeing to stop advertising its service to consumers and private companies.

Final thought

Is it time for the UK to take a more protective view, both as individuals and as companies, and within our legal framework? If nothing else, it would ensure UK PLC is not having its inherent value scraped away without its knowledge. Perhaps we should all begin by asserting copyright over our own images online, although that would only be a start.

With the UK Government now starting to take AI seriously and looking to adopt the technology in key sectors such as healthcare, there has to be an awareness that data ownership rights and onward commercialisation must be examined properly, and that the power of the UK courts is severely restricted in cases involving international jurisdiction.
