Keywords: Hong Kong, biometric data,

Face recognition technology to help "tag" friends in photographs, fingerprint recognition to unlock smartphones, and fingerprint door locks are just some of the ways in which biometric data has been used in recent years. The constant barrage of news of cyber-threats has sparked a renewed interest in biometrics: DNA matching, visual biometrics (retina, iris, ear, face, fingerprint), spatial biometrics (finger geometry, hand geometry, signature recognition), auditory biometrics (voice authentication or identification), olfactory biometrics (odour), behavioural biometrics (gait, typing recognition) and biometrics based on brain and heart (drawing on certain brain and heart patterns unique to each individual) are just some of the possible technologies being discussed. In Asia, the uptake of biometric technology includes the development of palm vein authentication technology for payments in Japan; the upcoming introduction of Biocarts in Japan in April 2016 to capture fingerprints and photos of passengers in an effort to cut down immigration processing times; fingerprint authentication for ATM transactions in Vietnam; and the launch of facial recognition technology for ATMs in China. Is this the end of long passwords and two-factor authentication systems? Can our memories now take a well-earned break from having to remember frequently changed passwords?

Biometric Data – For or Against?

In a consumer context, biometric technology can enhance the user experience by speeding up delivery and allegedly offering increased security. But is a fingerprint scan more secure than traditional password authentication? Fingerprints can easily be "lifted" and used to fool fingerprint sensors to gain access to a device, as a recent incident involving a German politician has shown us.

Outside of the consumer context, there has been an increased uptake of biometric technology to track employee attendance. Such use gives rise to a host of data privacy concerns, particularly given the nature of the employer-employee relationship, where there is inevitably an unequal balance of power.

Biometrics is also attracting a lot of interest as a tool for stepping up national security in an age of hyper-sensitivity over cyber-attacks and cyber-espionage. The possible introduction of facial matching systems relying on stills rather than live CCTV feeds for use by law enforcement and security agencies has recently sparked controversy in Australia due to a 20% margin of error.

The fact remains that, regardless of the benefits of biometrics, the collection of such sensitive data in itself makes the individual vulnerable to a different type of threat, namely misuse, theft or leakage of data or, sometimes, an erosion of human dignity. Unlike passwords, which can be reset when hacked, biometric features, once stolen, cannot be replaced.

Biometrics and the Evolution of Data Privacy

Biometric data can be used to identify the individuals from whom it was collected. While such data can be stored on a device or card retained by the individual, the data is also recorded in a central database (many biometric applications have functionalities dependent on such central databases). Should such data be freely collected, and how can data subjects be assured that their biometric data will not be misused? Who has access, or who can gain access, to the central database recording the biometric data of individuals? The Aadhaar project in India, the most ambitious biometrics database in the world, involves the gathering of fingerprints, iris scans and photos of Indian citizens living in India. The project has been beset by criticism because of the need to counter-balance such an exercise with adequate protection of the individual's privacy; a hard thing to achieve absent any over-arching privacy or data privacy protection in the country.

Yet, the seduction of technology can be giddying: just think of Mastercard's thumbprint biometric card (no more passwords or PINs!), Tesco's facial recognition advertising screens (face-detection cameras in the screen determine the gender and age of customers and, depending on location and time of day, allow customisation of ads), apps using facial recognition to authorise payments (an individual's features become his or her PIN for purchases), and eyeball selfie scanning on the phone (eye-recognition technology for personal banking apps on one's smartphone).

It is no surprise that the collection and use of biometric data has led to heightened public and regulatory concern about the risks posed by such practices to data privacy. Most countries, however, do not have laws that specifically address the collection and use of biometric data, relying instead on general provisions in their data privacy laws; a square peg in a round hole.

The data privacy laws of many jurisdictions in the Asia-Pacific region do not expressly define or clearly refer to biometric data. Instead, biometric data that falls within the definition of "personal data" or "personal information" (e.g., data from which it is practicable to identify an individual) is subject to the existing general data privacy laws.

Some jurisdictions, such as Malaysia and Australia, have additional protections and restrictions regarding the collection and use of "sensitive data" or "sensitive information". Restrictions include a prohibition on collection and use unless the relevant individual has given his or her explicit consent or one of the exemptions applies (e.g., such use is necessary for medical reasons).

Australia is one of the few jurisdictions that specifically refers to biometric data in its data privacy legislation. Under the Australian Privacy Act 1988 (as amended up to Act No. 62, 2015), "sensitive information" is defined to include "biometric information that is to be used for the purpose of automated biometric verification or biometric identification" and "biometric templates". Biometric information or biometric templates cannot be collected unless the individual has provided his/her consent to the collection and the information is reasonably necessary for the purposes of a function or activity of the data user. Biometric data cannot be disclosed by the data user for any purpose other than one directly related to the original purpose of collection, unless the relevant individual has expressly consented to the disclosure or the disclosure falls within a specified exemption. Biometric data collected in Australia but transferred outside of it remains subject to the provisions of the Australian Privacy Act 1988, which has extra-territorial effect.

Australia was also the first Asia Pacific jurisdiction to have in place a biometric guideline. In July 2006, the Australian Privacy Commissioner approved the binding nature of the Privacy Code issued by the Biometrics Institute1, under the sponsorship of the Australian Government.

After six years in operation, the Privacy Code was revoked in 2012 at the request of the Biometrics Institute, due to the changes in technology and the privacy environment since the Privacy Code was drafted.

The Hong Kong and Singapore data privacy legislation contains no separate definition of sensitive data. In both Hong Kong and Singapore, biometric data would likely fall within the scope of personal data, and be protected under the respective data privacy laws, if an individual can be identified from the biometric data itself, or if such data, used in conjunction with any other information to which the organisation has access, can serve to identify the individual.

By contrast, in 2012, the EU Article 29 Working Party2 issued Opinion 03/2012 on developments in biometric technologies ("EU Biometric Opinion"). The purpose of the EU Biometric Opinion was to update the general guidelines and recommendations on the implementation of the data protection principles in relation to biometric data. The Working Party found that most biometric data amounted to personal data, and that its use therefore had to comply with the data protection principles of the EU Directive 95/46/EC on the protection of personal data ("Data Protection Directive"). Once this was established, the EU Biometric Opinion provided unsurprising guidelines, such as that biometric data should only be collected and processed if: (i) informed consent is freely given by the data subject; (ii) the processing is necessary for the performance of a contract to which the data subject is a party, and only where biometric services are being provided; (iii) the processing is required for compliance with a legal obligation; or (iv) the processing is necessary for the legitimate interests of the data controller, but only if such interests prevail over the data subject's fundamental rights and freedoms (e.g., to minimise high security risks that cannot be addressed by alternative, less invasive measures). Obligations for data controllers to clearly define the purpose for which they collect and process biometric data were included, as were requirements to limit such collection and processing to what is proportionate and necessary. The EU Biometric Opinion also emphasises the importance of ensuring that data subjects are adequately informed about the key elements of how their biometric data is processed.

In addition to addressing data users' obligations in the context of the Data Protection Directive, the EU Biometric Opinion outlined various types of biometric technology and the specific risks posed by each, as well as technical recommendations aimed at protecting the biometric data being processed.

That was the general landscape as of April 2012. In April 2015, the EU Court of Justice missed its opportunity in the Willems3 case to require member states to call into question their practices on the use of, and access to, central databases of biometric data.

The Willems case concerned the EU Regulation4 requiring Member States to collect and store biometric data (including fingerprints) in passports and other travel documents, for the purposes of verifying the authenticity of the document or the holder's identity. The fundamental question in Willems was whether or not the Regulation, together with the Data Protection Directive and the Charter of Fundamental Rights of the European Union, requires Member States to guarantee that any biometric data collected and stored pursuant to the Regulation will not be used for other purposes. The EU Court of Justice found that the Regulation did not expressly prohibit the Member States from using the biometric data for any other purpose, and that this issue needed to be determined at national level.

For now, obligations in respect of such biometric data may be seen as a matter for individual states but, given recent terrorist attacks in Europe and elsewhere, this question is likely to be revisited before too long.

Closer to Home – Hong Kong

In Hong Kong, the biggest collector of biometric data is the Hong Kong government. All Hong Kong residents have their fingerprint data stored on their Hong Kong identity cards. A new smart biometric identity card, for which the Hong Kong government has set aside a whopping budget of HK$2.9 billion, is expected to be introduced in phases between 2018 and 2022. The new smart(er) ID card will store higher resolution images for facial recognition and other enhanced biometric data.

Apart from this, Hong Kong has witnessed an increased adoption and use of biometric technology by the private sector. A few recent instances of misuse of biometric data raised concerns with the former Hong Kong Privacy Commissioner ("PC"), especially in an employment context. On 20 July 2015, the outgoing PC, just days before completing his term in office, issued a Guidance on Collection and Use of Biometric Data ("Guidance Note")5.

Sensitive Data and Biometric Data in the Hong Kong Context

Even before the issuance of the recent Guidance Note, the previous PC tended to take a stricter approach to the application of the Data Protection Principles ("DPPs") under the Hong Kong Personal Data (Privacy) Ordinance ("PDPO") in respect of personal data that he considered to be "sensitive", taking into account the nature of the information (see various guidance notes and reports of investigations issued and conducted by the PC in the last couple of years). Examples of personal data that are generally considered to be "sensitive" include Hong Kong identity card numbers, medical records and biometric data.

During the consultation period for the Amendment Ordinance 2012 (which introduced changes to the PDPO), the government considered introducing a new category of "sensitive data" (which included biometric data) with more stringent controls attached. Due to a lack of consensus on the coverage and regulatory model for the protection of sensitive data, the proposal was not pursued6. We note in passing that many representatives from the information technology sector strongly opposed the proposal for fear that it would hamper the development of biometric technology7. While the proposal to introduce a new regime to protect "sensitive data" and, particularly, biometric data, was set aside, the government suggested that the PC issue guidelines on best practices for the handling of biometric data, in order to afford better protection to individuals8.

On 20 July 2015, the Guidance Note9 was issued in the wake of several cases that raised public concern over the collection of DNA and fingerprints by employers. This was almost the swan song for the former PC before his term finished on 3 August 2015. The Guidance Note replaces the Guidance Note on the Collection of Fingerprint Data issued in May 2012.

Bits of Us: Hong Kong Cases Relating to Biometric Data

One of the cases that prompted the issuance of the Guidance Note concerned an investment company, which in May 2014 made headlines when it required all female staff to provide blood samples for DNA testing in a misguided attempt to investigate toilet hygiene complaints. On 21 July 2015, the former PC issued an investigation report regarding the collection of employees' fingerprint data by a fashion trading company. In both cases, the former PC found that the collection of such data was excessive, as the sensitive nature of the data was disproportionate to the purpose of collection, and less privacy-intrusive measures were available.

In an employer-employee context, even if the collection of biometric data may be justified and proportionate, alternative options should still be provided to the employee (e.g., a choice of password access instead of a fingerprint scan); otherwise the employees' consent to the collection and use of their biometric data cannot really be said to be voluntary or "fair" for the purposes of the PDPO.

Guidance Note

The Guidance Note (which is reminiscent of the EU Biometric Opinion) provides practical guidance to data users on the limited circumstances in which biometric data may be collected and, if it can be collected, the steps that need to be taken regarding the collection and storage of such data, namely:

  1. Biometric data should only be collected and used in accordance with the relevant data privacy law;
  2. There must be a clear legal purpose for which the biometric data is being collected;
  3. Biometric data must only be collected and used if it is relevant and not excessive in order to achieve such purpose;
  4. An analysis should be conducted to determine whether the proposed biometric technology is essential to and will be effective to achieve the relevant purpose, and whether there are less privacy intrusive alternatives;
  5. Sufficient and effective security measures should be implemented to protect the biometric data, taking into account the sensitive nature of the data; and
  6. The data user should establish a retention period, and should ensure that biometric data is deleted once it is no longer needed for the purpose for which it was collected.

The Guidance Note, like the EU Biometric Opinion, goes into further detail and provides practical advice and examples in order to assist data users in their handling of biometric data. In brief, this advice includes the following:

  1. Fully inform individuals of the privacy risks and issues involved in the collection of their biometric data, and whether the biometric data may be relied upon to take adverse action against them;
  2. Only use and disclose the biometric data for the purposes for which it was originally collected and notified to the data subjects, unless one of the exemptions applies or the explicit consent of the data subject is obtained;
  3. Enter into contracts with service providers who receive the biometric data, to ensure that the data is not retained longer than necessary and is kept secure10;
  4. Implement internal policies to ensure that employee-access to the biometric data is restricted and regular training is provided, and take disciplinary actions against any breaches of those policies; and
  5. Conduct regular and frequent reviews of the biometric data held to ensure that any unnecessary biometric data is purged.

Conclusion: No Sweat, No Tears?

Whilst there are no specific laws in the Asia-Pacific region which solely regulate the collection and use of biometric data, given the increasing adoption of biometric technology, further guidelines and regulations are likely to follow throughout the region.

Hong Kong is one of the first Asian jurisdictions where a regulator has issued specific guidelines on the collection and use of biometric data. Even though the Guidance Note is not legally binding and a breach of its provisions will not in itself constitute an offence, the PC will likely take into account a data user's non-compliance with the Guidance Note when determining whether or not a breach of the PDPO and the DPPs has occurred. If, after an investigation, the PC finds that there has been a breach of the DPPs, it may issue an enforcement notice requiring the data user to take certain remedial action. Failure to comply with an enforcement notice is an offence, punishable by a fine of HK$50,000 and two years' imprisonment (plus a daily fine of HK$1,000 if the offence continues). However, an even bigger concern for data users is the PC's ability to name and shame organisations that have breached the PDPO, which can result in irreparable reputational damage.

Advancements in biometric technology have rendered everyday transactions more convenient and more efficient. As more and more "bits" of us are being captured, compressed, encrypted and used to enable daily transactions, what safeguards do we need to have in place, and how will these safeguards differ from one place to another?

The recent Willems case in Europe highlighted the confluence of concerns regarding biometric data: data privacy and human rights set against national or international obligations regarding the collection of such data. Add to this mix cyber security concerns and more questions arise.

It seems that in the meantime, more blood, sweat and tears will need to be shed to achieve the elusive balance between improving efficiency and greater security through the use of biometric data, versus safeguarding personal privacy, human dignity and, ironically, the security of such data.

Footnotes

1. The Biometrics Institute was founded in 2001 as an independent and impartial international forum, with the aim of promoting the responsible use of biometrics. The Biometrics Institute has offices in Australia and the UK.

2. The Article 29 Working Party was set up under Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data.

3. Joined Cases C-446/12 to C-449/12 Willems.

4. Council Regulation (EC) No 2252/2004.

5. https://www.pcpd.org.hk/english/resources_centre/publications/files/GN_biometric_e.pdf

6. The Report on Public Consultation on Review of the Personal Data (Privacy) Ordinance issued in October 2010 by the Hong Kong government: http://www.cmab.gov.hk/doc/issues/PCPO_report_en.pdf

7. Ibid.

8. Ibid.

9. https://www.pcpd.org.hk/english/resources_centre/publications/files/GN_biometric_e.pdf

10. DPP 2(3) and DPP 4(a). See also https://www.pcpd.org.hk/english/resources_centre/publications/files/dataprocessors_e.pdf

Originally published 18 December 2015

Learn more about our Cybersecurity & Data Privacy, Intellectual Property and Technology, Media & Telecommunications practices.

Visit us at www.mayerbrown.com

Mayer Brown is a global legal services organization comprising legal practices that are separate entities (the Mayer Brown Practices). The Mayer Brown Practices are: Mayer Brown LLP, a limited liability partnership established in the United States; Mayer Brown International LLP, a limited liability partnership incorporated in England and Wales; Mayer Brown JSM, a Hong Kong partnership, and its associated entities in Asia; and Tauil & Chequer Advogados, a Brazilian law partnership with which Mayer Brown is associated. "Mayer Brown" and the Mayer Brown logo are the trademarks of the Mayer Brown Practices in their respective jurisdictions.

© Copyright 2016. The Mayer Brown Practices. All rights reserved.

This article provides information and comments on legal issues and developments of interest. The foregoing is not a comprehensive treatment of the subject matter covered and is not intended to provide legal advice. Readers should seek specific legal advice before taking any action with respect to the matters discussed herein. Please also read the JSM legal publications Disclaimer.