On Thursday, September 12, 2019, the California legislature adopted a three-year moratorium on state and local law enforcement use of facial recognition through body-worn cameras. The bill, AB-1215, now heads to Governor Gavin Newsom’s desk and would come into effect on January 1, 2020, if signed.
The moratorium comes on the heels of initiatives by the cities of San Francisco, Oakland, and Somerville, Massachusetts (a suburb of Boston) to ban the use of biometric surveillance by law enforcement. Proponents of these measures argue that ubiquitous facial recognition chills free speech in public places and has a disproportionate impact on women and people of color due to an elevated risk of “false positive” identifications. The California legislature went further, finding that facial recognition is the “functional equivalent of requiring every person to show a personal photo identification card at all times in violation of recognized constitutional rights.” To enforce the ban, AB-1215 establishes a private right of action allowing individuals to seek equitable or declaratory relief against law enforcement agencies that violate the moratorium.
California’s moratorium marks another step in the broader debate over biometric surveillance technologies in both the public and private sectors. Facial recognition and other biometric technologies hold some promise to make our lives easier and more efficient. For example, Delta Air Lines has been experimenting with facial recognition to expedite international check-ins at airports around the United States. The company hopes that enhanced identity verification will alleviate security checkpoint delays and bottlenecks around ID checks.
But quick and efficient identification comes with risks—biometrics cannot be changed like a credit card number or login credentials. A fingerprint or facial template is effectively unalterable and becomes a unique signature for a person’s authenticated identity. Because of the privacy risks inherent in such signatures, regulators are looking closely at facial recognition practices and are requiring robust privacy assessments or enhanced consent to authorize their use.
Recently, Sweden’s data protection authority issued its first fine under the GDPR (200,000 Swedish kronor) to a school district that had conducted a trial using facial recognition technology to track student attendance. Although the school district had obtained parental consent, the regulator found that it had not identified a legally adequate justification for collecting sensitive biometric data on students. Companies and governments considering the adoption of facial recognition technology should take this action to heart and carefully weigh its anticipated benefits against the heightened privacy risks associated with biometrics.
Facial recognition laws and regulations will continue to evolve as biometric surveillance techniques become more effective and ubiquitous. Stay tuned to PH Privacy in the coming months as we track the development of these laws and offer insights into how companies can navigate this complex and shifting regulatory landscape.
The content of this article does not constitute legal advice and should not be relied on in that way. Specific advice should be sought about your specific circumstances.