An article by BCL partner Julian Hayes and legal assistant Andrew Watson, examining the critical balance between human rights, privacy and the police's use of biometric technology, has been published in Police Professional.

Here's an extract from the article:

" "It is vital the Government works to empower police to use technology to keep the public safe while maintaining their trust."

With those words, the Home Secretary announced the recent appointment of Professor Fraser Sampson as England and Wales' first Biometrics Commissioner and Surveillance Camera Commissioner. Combining two previously distinct offices, the new Commissioner will be responsible for promoting the appropriate use of biometric data as well as the overt use of surveillance camera systems by relevant authorities.

It is fair to say the professor's tenure got off to a contentious start when, just two months into the job, he reportedly suggested that discretion rather than law should govern police use of facial recognition technology (FRT). While FRT arouses much controversy, it is just one of a fast-growing range of technologies available to law enforcement that harness the power of algorithms or artificial intelligence (AI) to achieve what, until recently, existed only in science fiction.

The Commissioner no doubt spoke for many people when he asked how, if certain technology was available, a policing body could responsibly not use it to prevent and detect crime and to keep people safe. However, such technological advances have developed without a dedicated legal and regulatory framework in place, raising serious concerns over privacy, fairness and human rights. Courts and legislatures are now beginning to grapple with the problem.

Algorithmic policing

Algorithmic policing technology falls into two broad categories: surveillance technology and predictive technology.

Surveillance technology automates the collection and analysis of data. Examples include facial recognition, social network analysis revealing connections between suspects, and, more prosaically, automated number plate recognition systems.

Predictive technology uses data to forecast criminal activity before it occurs, allowing the police to intervene to apprehend suspects or prevent them from offending. Falling into this category are predictive mapping programmes such as PredPol, which identify crime 'hot spots', and individual risk-assessment programmes, such as the Harm Assessment Risk Tool developed by Durham Constabulary, which seek to predict the likelihood that an individual will re-offend. Also in this category is emotion recognition, which analyses facial expressions to try to decode an individual's mood and intentions, and is to be trialled by Lincolnshire Police.

This article was originally published by Police Professional on 25/05/21. You can read the full version on their website.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.