UK Information Commissioner releases guidance on Workplace Monitoring

Earlier this month, the UK Information Commissioner's Office ('ICO') published new guidance on lawful monitoring of employees in the workplace. The guidance is designed to help employers comply with their obligations under the data protection legislation – the UK GDPR and the Data Protection Act 2018 ('DPA').

The guidance emphasises that employers must:

  • Comply with the data protection principles of the UK GDPR, regardless of which monitoring technology is being used;
  • Choose the least intrusive means possible to achieve the purpose of the monitoring; and
  • Identify a permitted purpose for which any special category data is processed, as set forth in Article 9 of the UK GDPR.

The purpose of the guidance is said to be protecting workers' data protection rights, and helping employers build trust with their workers, customers and service users. The guidance addresses monitoring taking place both on premises and remotely, and (notably) both within and outside of work hours. The guidance states that those working from home are likely to have a higher expectation of privacy.

The guidance emphasises the importance of monitoring only in ways that employees would reasonably expect – for example, monitoring work emails and internet searches carried out on the company network. The ICO recommends that employers undertake impact assessments to establish the impact of such monitoring, which should include seeking the views of employees and acting on concerns and recommendations.

Québec Private Sector Act Comes into Force

On September 22, 2023, most of the changes brought to Québec's Act respecting the protection of personal information in the private sector (the 'Private Sector Act') by the recently passed 'Law 25' came into force.

Taking inspiration from the GDPR, the Private Sector Act now includes provisions covering:

  • Consent (which may be implied) as the only legal basis for the collection, use, and communication of personal information (unless an exception applies);
  • Transparency requirements for automated decision-making (both artificial intelligence and other types of algorithms);
  • Obligatory privacy impact assessments in certain situations;
  • Additional clarity and transparency requirements with regard to privacy and data processing activities;
  • New obligations on data controllers relating to the use of location-tracking and profiling technologies; and
  • New requirements for the transfer of personal information outside Québec.

These changes follow a previous set of amendments that came into force a year earlier, which included:

  • The requirement to name a person in charge of the protection of personal information (similar to the role of data protection officers seen in some other jurisdictions);
  • Obligations related to data breaches (known as 'confidentiality incidents' in Québec law); and
  • New powers for the province's data protection regulator, the Commission d'accès à l'information du Québec.

Failure to comply with the Private Sector Act can lead to significant penalties: the maximum fine is the greater of CAD 25 million or four percent of worldwide turnover, the steepest such penalty by a wide margin among the Canadian provinces.

Proposal on UK-US Data Bridge Agreed

The Regulation regarding the UK-US Data Bridge (the 'Regulation') came into force on October 12th 2023; it is intended to reduce the regulatory burden for organisations transferring personal data between the UK and the United States.

Prior to the Data Bridge coming into force, the lack of a UK 'adequacy' decision in relation to the US meant that UK businesses were required to implement specific contractual provisions and undertake a detailed risk assessment prior to each transfer.

The Regulation piggybacks onto the EU-US Data Privacy Framework, which was approved by the European Commission in July (the 'Framework'). Under the Framework, US-based businesses can opt in to a certification regime enforced by the US Federal Trade Commission and Department of Transportation.

The UK Secretary of State for Science, Innovation and Technology said of the data bridge:

'Data bridges not only offer simpler avenues for the safe transfer of personal data between countries, but also remove red tape for businesses of all sizes and allow them to access new markets.'

As discussed in our last Data Blast (here), the Framework is already being challenged before the EU's General Court, although the court has refused a request to suspend the Framework immediately pending determination of the legal challenge. If the Framework is struck down, then the UK-US Data Bridge will fall with it. It remains to be seen whether the Data Bridge will be subject to a parallel legal challenge.

French Regulator finds Scientific Research to be in the Public Interest

The National Institute of Demographic Studies ('INED') planned to conduct a 'Families and Employers longitudinal survey project' (the 'Survey Project') which would involve processing respondents' sensitive personal data. The purpose of the survey project was to collate statistical data relating to the balance between the professional, family and personal lives of respondents. The data could then be used to analyse the impact of the interrelations between these aspects on life outcomes (including life expectancy) and to understand the risk factors arising from different professions, genetics, and domestic circumstances.

INED planned to rely on the public interest legal basis for processing data (Art. 6(1)(e) GDPR), but made a pre-emptive request to the French data protection regulator (the 'CNIL') under Art. 36 GDPR to seek confirmation that this was lawful.

The CNIL gave the following opinion:

  • Processing sensitive data relating to the health, sexuality and religion of the data subjects for the Survey Project is in the public interest and therefore permissible;
  • For the data to be transferred to third parties, it must be in fully anonymised form – INED had planned only to pseudonymise the data – and must be relevant and limited to what is necessary in relation to the purposes for which the data is being processed;
  • With regard to storage and retention, INED's policy was to retain the data until ten years after the date of the last request to access the results of a given survey, following which it would be archived. The CNIL accepted this policy but objected to INED's retention of a copy of the data after archiving;
  • Security measures must be taken in line with Arts. 5(1)(f) and 32 GDPR; every data transfer outside INED must be monitored, whether automatically or manually; and any transfers outside the EU must comply with Chapter V GDPR;
  • INED must provide information to respondents both by letter and by email or text message, and repeat the information before the respondent completes the online survey; and
  • INED must provide a simple way for respondents to object to continued data processing, with both online and offline options.

Romanian Regulator Fines Market Research Company for Unsolicited Email

The Romanian data protection regulator ('RDPA') has fined a market research company for sending an unsolicited message to an e-mail address collected from a public source for the purpose of market research.

A data subject complained to the RDPA after receiving an unsolicited email. The data subject responded to the email, objecting to the processing of her data, but continued to receive similar emails. The RDPA found that the company had collected the data subject's personal data (i.e. name, surname, e-mail address, place of work) indirectly from publicly-available sources and used it for the purpose of conducting market research.

The RDPA found breaches of Arts. 14(2)(f), 17(1)(c) and 21(1) GDPR on the following grounds:

  • The controller did not inform the data subject that it had collected her data from publicly available sources – this was a failure to comply with the obligation to provide clear and complete information to the data subject; and
  • The controller did not facilitate the data subject's right to object to the processing of her data, given that the controller continued to send emails to the data subject following her objections.

Accordingly, the RDPA fined the company RON 9,898 (c. EUR 2,000) and ordered it to bring its processing in line with the GDPR. The public statement issued by the RDPA does not state whether ePrivacy compliance was also considered.

Irish County Council in breach of GDPR for Surveillance

Of its own volition, the Irish Data Protection Commission ('DPC') opened an inquiry into Galway County Council (the 'Council') and An Garda Síochána (the Irish police) concerning the use of surveillance technologies. The particular technologies in question were CCTV systems, body-worn cameras and automated number plate recognition ('ANPR').

The scope of the DPC's inquiry was to assess the legitimacy of the processing activities in light of the GDPR, Law Enforcement Directive ('LED') and the Irish Data Protection Act ('DPA'). The DPC found that only the GDPR and DPA apply to body-worn cameras and ANPR, as they do not serve law-enforcement purposes, but rather health and safety and traffic management purposes, respectively. In order to fall within the scope of the LED, processing must be specifically and concretely used for law enforcement purposes; it is not sufficient that the data could potentially be processed for such purposes.

The DPC found that ANPR may enable identification of data subjects within the vehicles in question (i.e. drivers and passengers), and therefore involves the processing of personal data. The Council relied on Art. 6(1)(e) GDPR (public interest) as its legal basis for data processing using ANPR. The DPC agreed that the use of ANPR cameras aids the traffic management function of officials, but found that it may also have significant impacts on the rights and freedoms of data subjects; the Council therefore could not rely on public interest as the legal basis for processing without having conducted a balancing exercise between the public interest and the impact on individuals' rights. Accordingly, the processing was in breach of the GDPR. The DPC was also critical of the Council for not undertaking a data protection impact assessment with regard to the use of ANPR, as required by Art. 35(1) GDPR.

The Council also relied upon Art. 6(1)(d) GDPR (protecting the vital interests of an individual) for the use of a body-worn camera when the wearer was threatened. The DPC found that the Council failed to carry out an assessment as to the necessity of body-worn cameras prior to their use. This meant that the Council could not prove that such use was in the public interest, and could not rely on Art. 6(1)(d) for such processing, either.

Accordingly, the DPC has banned the use of body-worn cameras and ANPR until a valid legal basis is identified for the processing, and has also issued the Council with a reprimand for breaching Art. 24(1) GDPR, which requires the Council to maintain organisational measures to raise staff awareness of the legal principles of data processing.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.