Police and security services are making ever more use of facial recognition technology, which can identify a person via CCTV by scanning their face and matching it against images held in a database.
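At its core, the matching step works by converting a face image into a numerical "embedding" and comparing it to stored embeddings, accepting the closest one above a similarity threshold. The sketch below is illustrative only; the function names, the cosine-similarity measure and the 0.8 threshold are assumptions for demonstration, not details of any deployed system.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two face-embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def match_face(probe, database, threshold=0.8):
    """Return the best-matching identity above the threshold, or None.

    `database` maps identity labels to stored embeddings. The threshold
    is an illustrative value -- real systems tune it to trade false
    matches against missed matches.
    """
    best_name, best_score = None, threshold
    for name, stored in database.items():
        score = cosine_similarity(probe, stored)
        if score >= best_score:
            best_name, best_score = name, score
    return best_name
```

The key point for the legal discussion is that the system never "knows" an identity; it reports the closest stored record above a tunable cut-off, which is why threshold choice drives both false matches and misses.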

No dedicated laws in Australia for facial recognition technology

Facial recognition technology (FRT) can be useful, but Australia does not have clear and dedicated laws regarding the use of this technology.

Privacy laws currently do not prevent people's faces being scanned and their biometric data being checked against a database, even when the person is simply shopping, sitting in a hotel having a beer, or marching in a demonstration.

Application of facial recognition technology

Police argue facial recognition technology helps them pick suspects out of a crowd and identify people captured on video at a crime scene. (Please see Facial Recognition, NSW Police Force, February 2023.)

Facial recognition technology is also used for unlocking smartphones, organising photos, monitoring by employers and scanning passports as we go through identity verification at airports.

In 2022 an investigation by consumer group CHOICE found that three major retail chains were using facial recognition on their customers. The retailers said the technology was being used to identify "persons of interest" such as shoplifters, and that signs were posted at the entrance to the stores saying the technology was being used. (Please see Renewed calls for national guidelines on using facial recognition after CHOICE investigation, ABC News, 16 June 2022.)

Questions regarding accuracy and potential for misuse

Most customers of these stores had no idea their faces were being scanned. Privacy questions arise if the images of good customers are stored and sold to other businesses, or those of bad customers are passed on to external security services.

In 2022 a survey of more than 1000 people conducted by CHOICE found 76% of respondents agreed that "regulation is needed to protect consumers from harms caused by facial recognition use in retail settings". (Please see Push for new law to regulate facial recognition technology in Australia.)

There are also doubts about the accuracy of FRT, with reports that the technology misidentifies people of colour and women at higher rates. In 2019 it was reported that London police trials of FRT had flagged people as suspects incorrectly in 96 per cent of cases.
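A figure like the London one is easier to understand as a base-rate effect: when genuine suspects are rare in a scanned crowd, even a system with a seemingly low error rate produces mostly false alerts. The arithmetic below uses illustrative numbers chosen for the example, not figures from the London operation.

```python
def false_alert_rate(population, suspects, tpr, fpr):
    """Fraction of alerts that are false matches.

    `tpr` is the chance a genuine suspect is flagged; `fpr` is the
    chance an innocent person is flagged. Illustrative base-rate
    arithmetic only -- not real operational statistics.
    """
    true_alerts = suspects * tpr          # suspects correctly flagged
    false_alerts = (population - suspects) * fpr  # innocents wrongly flagged
    return false_alerts / (true_alerts + false_alerts)
```

With 50 genuine suspects in a crowd of 100,000, a 90 per cent detection rate and only a 1 per cent false-flag rate, roughly 96 per cent of all alerts are still wrong, because innocent people vastly outnumber suspects.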

Push for regulation of use of facial recognition technology

Privacy and surveillance laws are currently covered in separate NSW and federal legislation, but they do not specifically deal with the use of FRT. (Please see Facial recognition technology - towards a model law, UTS Human Technology Institute, September 2022.)

The Human Rights Law Centre has said that as these new surveillance technologies have grown, the law has not kept pace in protecting privacy and human rights such as freedom of assembly, freedom of expression and freedom of movement. (Please see Australia needs dedicated facial recognition technology law, Human Rights Law Centre, 27 September 2022.)

"Right now Australian governments and corporations are using these technologies in an unregulated landscape, with few specific safeguards or oversights," said the HRLC's senior lawyer Kieran Pender.

In 2021 the Office of the Australian Information Commissioner found that the US company Clearview AI had breached privacy laws by taking Australians' biometric data from the internet without consent and using FRT to reveal identities. (For more information, please see Crackdown on facial recognition on social media.)

New laws should impose conditions on the use of FRT by businesses, police and security services to protect the right to privacy, and should give the public and human rights advocates the right to challenge its use.

John Gooley
Criminal law
Stacks Collins Thompson

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.