“Over 2,400 Police Agencies” use Intrusive Clearview AI Facial Recognition System
First of all, I need to get this out of the way: Shame on you, Jason Calacanis. According to The Verge:
The interview was taped in May, but Calacanis didn’t release it until today. That’s because it took place the morning after George Floyd was killed by police. “Once the protests started in America, and we were watching these anti-racism protesters, we decided we might hold the interview because it didn’t feel like the right time.”
Seriously. Shame on you. You knew that more than 2,400 police departments were using Clearview AI’s software, and you sat on the interview admitting that for months.
Anyway, back to it. Clearview AI’s software has a major selling point for police: agencies upload a photo of a person, and the software runs facial recognition algorithms against a database of millions of faces and names.
Where does Clearview AI get this information? Freely available content on the Internet, including sites like Facebook, Instagram, LinkedIn, Flickr, Venmo, and “millions of other websites.”
This is unlike Amazon’s Rekognition – which the company stopped selling to law enforcement earlier this year – in that Rekognition requires users to upload their own pictures. Clearview AI comes with the entire data set built in.
Facial recognition software has already resulted in at least two arrests in recent months, and I’d bet that there are more that haven’t come to light.
Police use of this technology needs to be outright banned as an absolute violation of our privacy rights. The Attorney General of New Jersey has already spoken out against the software and banned agencies from using it.
Being in public, especially at a protected First Amendment activity, should not give law enforcement the right to track you. In the face of multi-million-dollar companies and local, state, and federal law enforcement with seemingly unlimited budgets, we need a real movement to expose the use of these tools and fight back.
Oh, and again? Shame on you, Jason Calacanis.