Companies called out for bad behavior often misdirect the conversation by saying, "Yes, but we do good things too!"

The worst defense of bad behavior ventured by any company, in my opinion, is "Yes, we are loathsome, but we create jobs." Jobs are good, don't get me wrong.

But every enterprise creates jobs. Companies working with toxic chemicals create jobs that kill employees. Companies that ruin the environment create jobs that steal clean air and water from our families. Pimps create jobs and exploit the most vulnerable. Drug kingpins create jobs and murder children.

If you can't create jobs without enormous social cost, then don't pretend you are good for my community.

Enter Clearview AI, the recently outed facial recognition database purveyor that downloads billions of pictures from the web and other places and sells search access to that data to whoever wants it. Stalking an old girlfriend? Use Clearview to find her. Creating a suspect pool for a local crime? Clearview can give you faces to arrest. The New York Times, in a front-page story, described the company's business model as the end of privacy.

Clearview touts its database's ability to identify terrorists and child-trafficking victims, a defense against claims that it has cheapened or eliminated important elements of privacy for everyone in its databases. Creating jobs and doing good works, no doubt, but at what cost? The New York Times recently reported that Clearview allowed investors and friends of the company to use an app to look up information about people they encountered day to day: snap a quick picture and find out details about someone who interests you. A billionaire grocery chain owner used it to surveil his daughter's date.

I have written in support of facial recognition systems for security purposes, noting that we all offer our faces to the world every day as proof of our identity. But the privacy problem raised by facial recognition programs comes not from face identification itself, but from the relatively new ability to scan billions of images for examples of a particular face and then place that person at locations and times where the person expected to be simply "a face in the crowd." These database searches destroy the expectation of security through obscurity that we all count on in our daily lives.

"The company has credentialed users at the FBI, Customs and Border Protection (CBP), Interpol, and hundreds of local police departments. In doing so, Clearview has taken a flood-the-zone approach to seeking out new clients, providing access not just to organizations, but to individuals within those organizations — sometimes with little or no oversight or awareness from their own management." This is according to Buzzfeed News, which also wrote, "Clearview AI has also been aggressively pursuing clients in industries such as law, retail, banking, and gaming and pushing into international markets in Europe, South America, Asia Pacific, and the Middle East." The New York Times wrote that Clearview AI was pitching to white supremacists the unique ability to use Clearview's app and identify certain people of interest.

No laws directly regulate or limit how Clearview AI can spread your face to anyone willing to pay for it, and the company itself does not seem to impose serious limits on which people within its customer organizations can use the database or for what purposes it may be accessed. While Illinois and other states regulate the capture and use of biometrics, they tend to exempt pictures of people's faces. Illinois' Biometric Information Privacy Act, for example, excludes photographs from the definition of "biometric identifier."

Not to mention Clearview AI's cavalier attitude toward protecting its data from thieves. After Clearview AI's last admitted data breach, a coindesk.com article said, "Clearview's security protocols are untested, unregulated and now proven unreliable. The company houses three billion images to feed an AI-powered surveillance tool used by corporate and state actors; now its client list has been published, showing once more it can't be trusted to maintain user privacy." The company's lawyer told the Daily Beast that Clearview had suffered a major theft of data, but that "Unfortunately, data breaches are part of life in the 21st century." He was also quoted in the coindesk article as saying, "A right to privacy in your face has never happened in the law. That's a new thing that people are making up now."

Clearview AI claims that its database cannot be searched by name, only by picture. A California-based writer for Vice exercised her rights under the CCPA to request, and receive, the pictures of her held by Clearview. The pictures had been taken between 2004 and 2019 and had been obtained not just from social media, but from some strange sites as well, such as Insta Stalker. Her article noted that, while some information, like your job or address, becomes stale over time, your picture never does.

It is difficult to tell whether Clearview AI has run a problematic course apparently free of ethical constraints because that is the nature of people who choose to make a business out of destroying individual privacy, or because its leaders are simply a deeply pernicious example of the Silicon Valley culture of creating an interesting technology and then trying every way possible to turn it into a self-sustaining enterprise. Many of the company's choices suggest that the lack of serious guardrails was a conscious decision.

It is my understanding from talking to technologists in this industry that such a database tool has been feasible for well over a decade, but companies like Google decided that the societal costs were not worth the benefits the technology provided. That meant that sooner or later, someone with less moral mooring would decide that the technology could make money, and damn the societal costs.

But if creating jobs should not excuse bad behavior that makes the world a decidedly worse place, then the value of a cool application should not excuse it either.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.