Credit scoring is automated decision-making according to CJEU

On 7 December 2023, the Court of Justice of the European Union ('CJEU') ruled that credit scoring constitutes automated decision-making, a practice which is prohibited under Art. 22 of the EU General Data Protection Regulation ('GDPR') unless specific conditions are satisfied. The ruling arose from consumer complaints against the German credit bureau SCHUFA. The CJEU determined that the company's fully automated assessment of creditworthiness qualifies as automated decision-making producing a legal or similarly significant effect within the meaning of Art. 22, where lenders rely on the resulting score in deciding whether to grant credit.

Art. 22 prohibits the use of personal data for solely automated decision-making that produces legal effects or 'similarly significant' consequences for data subjects, unless the data subject consents to the automated processing or certain other conditions (such as necessity for the performance of a contract) are met.

SCHUFA argued that credit scoring does not amount to decision-making because any adverse effects on the data subject stem from the independent decisions of the entity using the score. The CJEU rejected this argument, holding instead that a credit agency's automated calculation of a creditworthiness score falls within the scope of automated decision-making under Art. 22 if a third party "draws strongly on that [score] to establish, implement or terminate a contractual relationship."

The CJEU has left it to the Administrative Court of Wiesbaden in Germany, the referring court, to determine whether German federal law contains a GDPR-compatible exception to the prohibition on automated decision-making. In the absence of an applicable exception, credit scoring agencies in the EU would be obliged to obtain explicit consent from consumers before evaluating their creditworthiness and to afford consumers the opportunity to object to the credit score provided.

UK ICO publishes consultation on employment guidance

On 12 December 2023, the UK Information Commissioner's Office ('ICO') announced its intention to develop an online resource on employment practices and data protection. The ICO plans to release draft guidance on the topic areas covered by the resource incrementally, with further additions over time. The first two areas, 'Keeping employment records' and 'Recruitment and selection', have been released in draft form for public consultation. The former provides guidance on complying with data protection law when keeping records about workers, with the aim of fostering compliance and promoting best practice. The latter provides guidance on compliance during recruitment exercises. Both drafts also include practical tools, such as checklists, designed to assist employers.

Interested parties can access the consultations at the following links: Keeping employment records and Recruitment and selection. The consultation period is scheduled to conclude at 5pm on 5 March 2024.

S. 166 of the Data Protection Act 2018 cannot compel UK ICO to change complaint outcome

In April 2023, a data subject lodged a complaint with the UK Information Commissioner's Office ('ICO') alleging that a bank ('the controller') had failed to update its systems, resulting in the erroneous transmission of personal banking data to an incorrect address. Despite the data subject informing the controller of this error, letters and cheques continued to be sent to the wrong address.

The ICO responded to the complaint in May 2023, indicating that it was satisfied with the controller's response and its handling of the data subject's personal data, and closed the case. When the data subject asked to see the controller's response, the ICO declined on the basis that it had been provided by the controller's Data Protection Officer for investigative purposes.

In June 2023, the data subject appealed the ICO's decision to the First Tier Tribunal under s. 166 of the Data Protection Act 2018 ('DPA 2018'), alleging that the ICO had not taken reasonable steps to resolve the complaint, including that the ICO had taken 26 days to reply to it. The data subject sought to have her complaint addressed and to receive all personal data relating to her accounts held with the controller.

The Tribunal dismissed the data subject's appeal on the ground that it fell outside the Tribunal's jurisdiction, for the following reasons:

  1. S. 166 DPA 2018 does not afford data subjects the right to appeal against the merits of the Information Commissioner's decision. The Tribunal clarified that its jurisdiction is procedural, focusing on instances where the Commissioner fails to respond appropriately to a complaint or to update a data subject on its progress.
  2. The High Court's decision in Delo v The Information Commissioner emphasised the ICO's wide discretion in handling complaints, implying that it is not the Tribunal's role to dictate the investigative process.
  3. The Tribunal noted that the data subject's use of s. 166 to seek a different complaint outcome had been criticised in previous cases. Since the desired outcome involved obtaining copies of personal data, the Tribunal stated it could not grant this within the confines of s. 166.
  4. Lastly, the Tribunal highlighted that, based on existing case law, the data subject could only obtain the remedy sought by applying for judicial review in the High Court.

This decision contrasts with recent EU case law, such as CJEU – C‑333/22 – Ligue des droits humains ASBL, BA v Organe de contrôle de l'information policière, which ties the information provided by a supervisory authority to the right to an effective remedy against its legally binding decisions. Rather than following the EU case law, the Tribunal followed the recent UK authority of Delo, R (On the Application Of) v The Information Commissioner [2023] EWCA Civ 1141.

Furthermore, the decision deviates from recent case law such as CJEU – Joined Cases C‑26/22 and C‑64/22 – SCHUFA (decided on the same day as the credit scoring ruling addressed above), which states that a legally binding decision of a supervisory authority is subject to full substantive judicial review. However, the ICO requires data subjects to 'lodge appeals with the First Tier Tribunal (Information Rights) within 28 calendar days.'

This case shows that the Tribunal is limited in the decisions it can review. There is therefore a concern about data subjects' right to an effective remedy if they are restricted in their ability to approach the High Court directly, or need to initiate separate legal proceedings in addition to bringing an action before the Tribunal.

European lawmakers reach a political agreement on the AI Act

Although the final text of the AI Act (the 'Act') has not been published, and no official text of the recent political agreement reached between the European Parliament and the Council has been released, reports suggest that the key terms of the Act will include:

  1. Certain AI systems will be prohibited as they present unacceptable risks (e.g., AI used for social scoring based on social behaviour or personal characteristics, untargeted scraping of facial images from the Internet or CCTV footage to create facial recognition databases, etc.);
  2. AI systems presenting a high risk to the rights and freedoms of individuals will be subject to stringent rules, which may include data governance/management and transparency obligations, the requirement to conduct a conformity assessment procedure, and the obligation to carry out a fundamental rights assessment;
  3. Limited-risk AI systems will be subject to light obligations (principally transparency requirements); and
  4. AI systems that are not considered prohibited, high-risk or limited-risk systems will not be within the scope of the AI Act.

The fines for breaches of the AI Act will be similar to those under the GDPR, set at the higher of a fixed amount or a percentage of annual global turnover:

  1. €35 million or 7% of annual global turnover for violations of the prohibition on certain AI applications;
  2. €15 million or 3% of annual global turnover for violations of the AI Act's obligations; and
  3. €7.5 million or 1.5% of annual global turnover for the supply of incorrect information to regulators.

The Act is expected to come into force in early 2024, with transition periods of between 6 and 24 months for compliance with its requirements.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.