Facial recognition firm faces possible £17m privacy fine


A US firm which claims to have a database of more than 10 billion facial images is facing a potential £17m fine over its handling of personal data in the UK.

The Information Commissioner's Office said it had significant concerns about Clearview AI, whose facial recognition software is used by police forces.

It has told the firm to stop processing UK personal data and delete any it has.

Clearview said the regulator's claims were "factually and legally incorrect".

The company - which has been invited to make representations - said it was considering an appeal and "further action".

It has already been found to have broken Australian privacy law but is seeking a review of that ruling.

'Google search for faces'

Clearview AI's system allows a user - for example, a police officer seeking to identify a suspect - to upload a photo of a face and find matches in a database of billions of images it has collected from the internet and social media.

The system then provides links to where matching images appeared online.

The firm has promoted its service to police as resembling a "Google search for faces".
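The article describes the system only at a high level. As a rough illustration of how a reverse face search of this kind typically works, the sketch below compares a query face embedding against an indexed collection of embeddings and returns the source URLs of the closest matches. It is a generic, hypothetical example, not Clearview's actual code: the embeddings, URLs and the face-recognition model that would produce them are all placeholder assumptions.

```python
# Illustrative sketch of a generic reverse face-image search, NOT Clearview's
# actual implementation. It assumes each face image has already been converted
# to a fixed-length embedding vector by some face-recognition model; the
# vectors and URLs below are random placeholders.

import numpy as np

# Hypothetical index: one embedding per scraped image, plus the URL it came from.
index_embeddings = np.random.rand(1000, 128).astype(np.float32)  # placeholder data
index_urls = [f"https://example.com/photo/{i}" for i in range(1000)]  # placeholder URLs

def search(query_embedding: np.ndarray, top_k: int = 5) -> list[tuple[str, float]]:
    """Return the URLs of the top_k most similar faces by cosine similarity."""
    # Normalise so that dot products equal cosine similarity.
    index_norm = index_embeddings / np.linalg.norm(index_embeddings, axis=1, keepdims=True)
    query_norm = query_embedding / np.linalg.norm(query_embedding)
    scores = index_norm @ query_norm
    # Highest-scoring entries first.
    best = np.argsort(scores)[::-1][:top_k]
    return [(index_urls[i], float(scores[i])) for i in best]

# Example: look up an uploaded photo's (placeholder) embedding against the index.
for url, score in search(np.random.rand(128).astype(np.float32)):
    print(f"{score:.3f}  {url}")
```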

But in a statement, the UK's Information Commissioner said that Clearview's database was likely to include "a substantial number of people from the UK" whose data may have been gathered without their knowledge.

The firm's services are understood to have been trialled by a number of UK law enforcement agencies, but those trials have been discontinued and Clearview AI no longer has any UK customers.

The ICO said its "preliminary view" was that the firm appeared to have failed to comply with UK data protection laws by:

  • Failing to process the information of UK citizens fairly
  • Failing to have a process in place to stop the data being retained indefinitely
  • Failing to have a lawful reason for collecting the information
  • Failing to inform people in the UK about what is happening to their data.

The UK Information Commissioner, Elizabeth Denham, said: "I have significant concerns that personal data was processed in a way that nobody in the UK will have expected.

"UK data protection legislation does not stop the effective use of technology to fight crime. But to enjoy public trust and confidence in their products, technology providers must ensure people's legal protections are respected and complied with."

The decision is provisional, and the ICO said any representations by Clearview AI would be carefully considered before a final ruling is made in the middle of next year.

'Best interests of the UK'

Hoan Ton-That, chief executive of Clearview AI, said: "I am deeply disappointed that the UK Information Commissioner has misinterpreted my technology and intentions.

"My company and I have acted in the best interests of the UK and their people by assisting law enforcement in solving heinous crimes against children, seniors, and other victims of unscrupulous acts.... We collect only public data from the open internet and comply with all standards of privacy and law."

There are some signs that big tech companies are growing wary of facial recognition.

Earlier in November, Facebook announced that it would no longer use facial recognition software to identify faces in photographs and videos.

But privacy campaigners warn that online tools and search engines using facial recognition technology continue to operate.