In West Lafayette, a ban on facial recognition technology failed this past December - but advocates say they plan to try again this week.
West Lafayette Councilmember David Sanders said he will re-introduce a ban on the technology at the council’s meeting Monday night. In an effort to improve the odds of the ordinance’s passage after last year’s veto, Sanders said he will include exceptions allowing police to use the technology in cases involving violent crime.
At issue is whether there is adequate oversight of the controversial technology - and the software companies that contract with the state to provide it.
Facial recognition technology uses computer programs to compare individuals caught on video or camera to a database of faces. Law enforcement agencies say it’s an important tool for investigations - but privacy advocates say databases of photos can include pictures taken from social media without a person's knowledge or consent.
In Michigan, driver’s license photos are automatically shared with the state police. Misidentifications are rare, but they have led to wrongful arrests.
In December, West Lafayette’s city council voted against pushing the ban through after the city’s mayor announced a veto.
During the hearing on the ordinance, Councilmember David Sanders asked West Lafayette Police Chief Troy Harris which company the department used to conduct facial recognition searches - asking specifically whether he had heard of one major provider, Clearview AI.
“Clearview? Never heard of it,” Chief Harris responded.
Clearview AI is among the most controversial companies providing facial recognition technology to law enforcement. The company has come under fire for its practice of pulling photos from places like Facebook without user consent. Those photos are then plugged into a database that is used to identify suspects.
Numerous law enforcement agencies across the country have used Clearview’s software - and privacy advocates worry there hasn’t been adequate oversight of the company, the technology, or its use.
Jameson Spivak is an associate at the Center on Privacy and Technology at Georgetown Law, a think tank that studies surveillance and privacy law. He said traditional facial recognition searches compare a photo of a suspect against databases of government-collected images, such as mugshots or driver’s license photos.
“In the case of Clearview AI, which is really unprecedented, is that this database includes faces they’ve scraped from the internet,” he said. “So basically if your photo is on the internet and it’s been identified, you can potentially be in this database. Probably the vast majority of people in this database have no idea that they are in it.”
Spivak said the Center on Privacy and Technology supports a moratorium on the use of facial recognition technology.
“In most cases, this technology was rolled out without really anybody knowing about it in the public,” he said. “A lot of times the elected officials don’t even know that it’s being used, because most law enforcement agencies are getting this technology not through the city, the state, or the county budget but from the federal government or non-profit police associations. Basically, the city council might have not even known it was being used.”
In West Lafayette, police officials said facial recognition queries are sent to either the Indiana Bureau of Motor Vehicles or the Indiana Intelligence Fusion Center. But Lt. of Investigations Jonathan Eager said that facial recognition technology is “rarely” used.
“We normally don't get a decent photo of a suspect in an investigation, so it ends up being a resource we don't use,” he said.
The Indiana Bureau of Motor Vehicles did not respond to WBAA’s request for comment.
Captain Ron Galaviz is the Chief Public Information Officer for the Indiana State Police. He said law enforcement agencies across the state send facial recognition requests to the Indiana Intelligence Fusion Center. According to Galaviz, photos sent there for a facial recognition search need to be “legally obtained.”
“We have a policy on our website that is open for public view,” he said. “At the end of the day, we are very cognizant and want to be very respectful of the rights of those people who are moving around out there. We want to operate within the boundaries of making sure people’s rights aren’t violated.”
The state’s policy on facial recognition use outlines that the technology may be used on an image if “reasonable suspicion exists” that the subject is involved with or has knowledge of “possible criminal or terrorist activity.”
Galaviz said requests require a “criminal nexus” to be worth pursuing.
“So again, going to a protest or a demonstration, a gathering where people have a right to be—if there is no criminal nexus to that, the request won’t be processed,” he said.
Galaviz also underlined that facial recognition matches are not enough to convict someone of a crime. The Fusion Center’s policy states, in all caps, that results of facial recognition searches are “investigative leads… NOT TO BE CONSIDERED A POSITIVE IDENTIFICATION OF ANY SUBJECT.”
“When we’re talking about using this kind of technology it definitely behooved the Fusion Center to enact a detailed policy step by step by step— what are the parameters and the boundaries by which this technology can be used,” Galaviz said.
But privacy advocates like Spivak worry that policies guiding law enforcement use of the technology aren’t enough to hold police accountable.
“Law enforcement everywhere says ‘we only use this for an investigative lead,’” he said. “But there’s nothing holding them to that.”
When first asked whether the Fusion Center used Clearview AI, Galaviz noted that the facial recognition policy mentions only Vigilant Solutions - a separate facial recognition provider. Privacy advocates say Vigilant Solutions is better known as a license plate recognition company and, importantly, had submitted to federal accuracy testing - something Clearview did not do until just last year.
Within the Indiana Intelligence Fusion Center’s policy documents, the only facial recognition provider mentioned is Vigilant Solutions. But a 2021 BuzzFeed News investigation found the state police had queried Clearview more than 5,000 times between 2018 and February 2020. The Fusion Center privacy policy is dated June 1, 2019.
And, according to Clearview itself, the Indiana State Police were the company's “first paying customer.”
When asked when the state of Indiana might have made the switch from Vigilant to Clearview, Galaviz responded via email that “both platforms are being used as a means of checks and balances.”
A FOIA request made by WBAA found that in 2020 and 2021, the Indiana State Police used Vigilant Solutions 373 times. Clearview AI, according to the records, was used 3,067 times over that same period - roughly eight times as many searches.
When asked to explain why Clearview AI is not mentioned in the state’s policy documents despite being used more, Galaviz said only that “Clearview provides a broader data set.” When pressed for further explanation, Galaviz said “I refer back to what I have previously said about the use of these platforms, which includes acknowledgment of the use of both platforms.”
Spivak, with Georgetown Law, said he could only speculate as to why Clearview is being used without showing up in state policy documents.
“One of the things I would speculate is that the police are hiding its use is because Clearview became kind of a toxic name,” he said. “Within the past few years, there have been a number of media reports finding that in developing their system Clearview broke the terms of service of Facebook, YouTube, a bunch of social media platforms because they scraped images from these websites.”
And Spivak said the lack of clarity makes it harder to ensure police are following their own policies.
“This leads to confusion and less transparency and then ultimately less accountability,” he said.
The situation in West Lafayette is not unique. According to Spivak, there has been little effort at the federal level to provide oversight and accountability for facial recognition technology - pushing local governments to take up the issue themselves.
“In this vacuum where the federal government isn’t doing anything the states and cities are stepping up to pass things,” he said. “Because the federal government isn’t doing anything about it, and because maybe even the states aren't doing anything about it, local activists, local politicians stepped up to try and do something.”
Across the country, Spivak said, more than twenty cities and two states have passed moratoriums on the use of facial recognition technology - although those moratoriums sometimes include carve-outs for violent crimes.
West Lafayette Councilmember David Sanders said he’s hopeful changes to the ordinance will help it pass this time around. He said he continues to worry about police use of the technology.
“The fact that there’s so much interaction with Clearview AI, and yet it's not present on their public documents—that says something to me,” he said. “It says there is something to hide and it says they clearly don’t think the public should know they are using Clearview AI.”