Britain’s data protection watchdog has confirmed a penalty for controversial facial recognition company Clearview AI, today announcing a fine of just over £7.5 million for a series of breaches of local privacy laws.
The watchdog has also issued an enforcement notice ordering Clearview to stop obtaining and using the personal information of UK residents that is publicly available on the internet, and to delete UK residents’ information from its systems.
The US company has amassed a database of more than 20 billion facial photos by scraping data from the public internet, such as social media services, to power an AI-based identity-matching service that it sells to entities such as law enforcement. The problem is that Clearview has never asked individuals whether it can use their selfies for that purpose. And it has been found to violate privacy laws in multiple countries.
In a statement on today’s enforcement action, UK Information Commissioner John Edwards said:
“Clearview AI Inc has collected multiple images of people around the world, including in the UK, from various websites and social media platforms, creating a database of over 20 billion images. The company not only enables identification of those people, but effectively monitors their behavior and offers it as a commercial service. That is unacceptable. That is why we have taken steps to protect people in the UK by both fining the company and issuing an enforcement order.
“People expect their personal information to be respected wherever in the world their data is used. That is why international companies need international enforcement. Working with colleagues around the world allowed us to take this action and protect people from such intrusive activities.
“This international collaboration is essential to protect people’s privacy rights in 2022. That means working with regulators in other countries, as we have done in this case with our Australian colleagues. And it means working with regulators in Europe, which is why I’m meeting them this week in Brussels so we can work together to tackle global privacy violations.”
“Given the large number of internet and social media users in the UK, Clearview AI’s database is likely to contain a significant amount of data from UK residents, which was collected without their knowledge,” the British watchdog also wrote in a press release.
“While Clearview AI no longer offers its services to UK organizations, the company has customers in other countries, so the company still uses personal data from UK residents,” it added.
The Information Commissioner’s Office (ICO) warned Clearview last fall that it could issue a financial penalty, when it also ordered the US-based company to stop processing UK citizens’ data and to delete all the data it held.
It confirmed those preliminary findings in today’s formal enforcement decision, finding Clearview in violation of a range of legal requirements.
Specifically, the ICO said Clearview had no legal basis for collecting people’s information; did not use individuals’ information in a fair and transparent manner, since individuals are not aware of, and cannot reasonably expect, their personal information being used in the way Clearview uses it; failed to have a process in place to stop the data being retained indefinitely; and did not meet the higher data protection standards required for biometric data (known as ‘special category data’ under the EU General Data Protection Regulation and the UK GDPR). In a further breach, Clearview asked for additional personal information, including photos, when members of the public asked whether they were in the database, hindering their data access rights. “This may have had a discouraging effect on individuals wishing to object to the collection and use of their data,” the ICO noted.
Clearview has been approached for comment on the UK sanction.
One thing to note is that the fine is significantly less than the £17 million-plus the ICO floated last fall in its preliminary order against Clearview. We have asked the regulator about the reduction, although the exact amount may be moot if Clearview simply refuses to pay.
Regulators have limited resources to enforce privacy orders against foreign entities that choose not to cooperate, and if a company has no local representative in the jurisdiction, an order can be very difficult to enforce.
Still, at the very least, such sanctions limit Clearview’s ability to expand internationally, as any local offices would be directly accountable to regulators in those markets.
The British penalty is certainly not Clearview’s first international sanction. The UK investigation was a joint proceeding with the Australian privacy watchdog, which last year also ordered the company to stop processing citizens’ data and delete all the information it held. France and Canada have sanctioned the company too, while Italy’s data protection regulator fined Clearview €20 million in March.
On home turf, meanwhile, Clearview agreed earlier this month to settle a 2020 lawsuit brought by the American Civil Liberties Union, which had accused it of violating an Illinois law (the Biometric Information Privacy Act; BIPA) that prohibits the use of individuals’ biometric data without consent.
The terms of the settlement appear to prohibit Clearview from selling, or giving access to, its facial recognition database to private companies and individuals nationwide in the US, although an exception was included for government contractors (albeit with a five-year ban on access for contractors within Illinois itself).
The settlement also requires Clearview to maintain an opt-out system allowing Illinois residents to block their photos from appearing in facial search results, and to end a controversial practice of providing police officers with free trials of its software when those individuals have not obtained approval through their departments.
Clearview, however, framed the settlement as a win, suggesting it will respond by selling its algorithm, rather than access to its database of scraped selfies, to private companies in the US.