Clearview AI is a developer of an artificial intelligence-based research tool designed to help law enforcement agencies identify perpetrators and victims of crimes. The company's platform uses facial-recognition algorithms and a database of more than 20 billion facial images sourced exclusively from publicly available web sources, such as news media, mugshot websites, public social media, and other open sources. Clearview AI's tools are used to help law enforcement track criminals, exonerate the innocent, and identify victims of crimes.
The company was founded in 2017 by Hoan Ton-That and is headquartered in New York City. The platform is built to serve federal, state, tribal, and local law enforcement agencies and has been used by agencies including:
- Fairfield Athens Major Crimes Unit Task Force
- The Georgia Police Department
- The Milton Police Department
- The Montgomery County, Texas Constable's Office
- The Anderson County Sheriff's Office
Clearview AI's platform is an artificial intelligence and computer vision platform enabling facial recognition for law enforcement agencies. The company suggests law enforcement agencies can use the platform for lead generation and public safety. The platform also includes compliance features intended to increase oversight, accountability, and transparency within those agencies, along with other software tools, such as dashboards, reporting, and metrics tools, intended to improve its user-friendliness.
The platform includes a database of over 20 billion facial images, which law enforcement agencies can use to generate leads that, when supported by other evidence, can help accurately identify suspects, persons of interest, and victims in order to solve and potentially prevent crimes.
The company states its facial-recognition algorithms are greater than 99 percent accurate across all demographics when searching the database. The algorithms are intended to be weighted so the platform is not biased against prior offenders. The technology can also be applied to cold cases to generate new leads, insights, and associations that may have otherwise been missed by other law enforcement strategies or databases.
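Clearview AI has not published its implementation, but facial-recognition lead generation of this kind is typically built on embedding similarity search: each face image is converted to a numeric vector, and a probe image is matched against the database by vector similarity. The sketch below is a minimal, hypothetical illustration of that general technique; all names (FaceIndex, the example URLs, the random vectors standing in for real face embeddings) are assumptions for the example rather than details of Clearview AI's system, and, as noted above, any matches returned would only be leads requiring corroborating evidence.

```python
# Illustrative sketch only: a generic embedding-based face search,
# NOT Clearview AI's actual implementation. All names are hypothetical.
import numpy as np

class FaceIndex:
    """Toy in-memory index mapping face embeddings to their source URLs."""

    def __init__(self, embeddings: np.ndarray, sources: list[str]):
        # Normalize rows so a dot product equals cosine similarity.
        self.embeddings = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
        self.sources = sources

    def search(self, probe: np.ndarray, top_k: int = 5, threshold: float = 0.6):
        """Return the most similar stored faces as candidate leads."""
        probe = probe / np.linalg.norm(probe)
        scores = self.embeddings @ probe               # cosine similarities
        ranked = np.argsort(scores)[::-1][:top_k]      # best matches first
        return [(self.sources[i], float(scores[i])) for i in ranked if scores[i] >= threshold]

# Example with random vectors standing in for real face embeddings.
rng = np.random.default_rng(0)
index = FaceIndex(rng.normal(size=(1000, 128)),
                  [f"https://example.com/img/{i}" for i in range(1000)])
print(index.search(rng.normal(size=128)))
```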
In May 2022, Clearview AI agreed to restrict US sales of the company's facial-recognition software to law enforcement agencies. The settlement came after a two-year lawsuit brought by the American Civil Liberties Union and other groups over alleged violations of an Illinois digital privacy law. As part of the settlement, Clearview AI also agreed not to make its database available to the Illinois state government and local police departments for five years.
The lawsuit stemmed in part from privacy concerns raised by Chicago-based Mujeres Latinas en Accion, which works with survivors of gender-based violence and was a plaintiff alongside the ACLU. The group was concerned that Clearview AI's database could be used by stalkers, ex-partners, or other predators to track an individual's whereabouts or social activity.
In February 2021, it was determined that Clearview AI, which had provided its platform and publicly sourced database of images to the RCMP and the Toronto Police Service, had violated federal and provincial private-sector privacy laws by scraping those images from the internet without permission.
The original joint investigation by the Office of the Privacy Commissioner of Canada and its counterparts, the Commission d’accès à l’information du Québec, the Information and Privacy Commissioner for British Columbia, and the Information and Privacy Commissioner of Alberta, recommended that Clearview AI:
- Stop offering the facial-recognition services that were the subject of the investigation in the three provinces mentioned above
- Stop collecting, using, and disclosing images of people in the three provinces without consent
- Delete images and biometric facial arrays collected without consent from individuals in the three provinces
In 2022, the Canadian government and related agencies ordered Clearview AI to follow the above recommendations after it was alleged that, as of December 2021, Clearview had not stopped processing or deleted the images and biometric information of Canadians, as recommended in July 2020. Clearview AI had previously advised the commissioners that it had complied with the first of the July 2020 recommendations. The 2022 order allows Clearview AI to seek judicial review, under which courts could reconsider and overturn the order; if the order is not overturned, Clearview AI could be subject to monetary penalties for non-compliance.
One complication of these orders has been the Toronto Police Service's use of Clearview AI. It was discovered that, between October 2019 and February 2020, officers uploaded more than 2,800 images to the company's database, and officers admitted using the service in approximately eighty-four criminal investigations during this time. The subsequent orders regarding Clearview AI's software put those cases in jeopardy, and they could be subject to retrial.
On March 13, 2022, it was revealed that Clearview AI's facial-recognition technology had been offered to the Ukrainian defense ministry during the Ukraine-Russia conflict. Clearview AI suggested the ministry could use the software to uncover Russian assailants, combat misinformation, and identify the deceased. As part of the deal, Ukraine was said to have been offered free access to Clearview AI's facial-recognition software, which would allow authorities to vet people of interest at checkpoints, among other uses.