Several days ago I posted an article about Artificial Intelligence (AI). In this morning's New York Times there is a fascinating and disturbing article about facial recognition – a technology not foreign to the many smartphone users who rely on it to unlock their phones and open apps. But the article, by Kashmir Hill, is about a company, Clearview AI, that provides advanced facial recognition software to law enforcement agencies, private companies, and public entities.
In the AI post of a few days ago, I made the point that neural networks (AI) need processing power (easily provided these days) and lots of information in order to learn. One of the pitfalls is who is providing that information and whether it unwittingly includes the biases of those who provide it. As noted, the internet is a treasure trove of data – data that you have provided in all the images you posted on Facebook, Instagram, and the usual list of suspects. For some of us, it is minimal. For others, the pictures posted run into the thousands. Clearview AI has spent years scraping those pictures into a massive database that it makes available to whoever is willing to subscribe – primarily law enforcement.
One major concern is that facial-recognition technology might be too flawed for law enforcement to rely on. A federal agency called the National Institute of Standards and Technology (NIST) periodically tests the accuracy of facial-recognition algorithms voluntarily submitted by vendors; Clearview has never participated.
There are issues of First Amendment rights and privacy rights; there are unsettling stories about the company's founding; and there is uncertainty about where this is all headed. While you might not post pictures online, our pictures are being taken all the time by public and private surveillance video – from street cameras, to store security, to the Ring camera on your front door. It has the hallmarks of "big brother." I am reminded of the expression: just because you're paranoid doesn't mean you are wrong.
Take a moment (actually many moments) to read Hill’s article.