As mentioned in a last-minute entry to our June Data Wrap, on 10 July 2023, the European Commission adopted its long-awaited adequacy decision for the EU-US Data Privacy Framework ("DPF"), determining that data transfers pursuant to the DPF benefit from an adequate level of data protection. This means that personal data can now flow freely to organisations participating in the DPF without the need for transfer impact assessments or the EU standard contractual clauses ("SCCs"), for example.

Whilst the adoption of the adequacy decision provides some much-needed certainty around EU-US international data transfers (in the near term at least), there remains a significant risk that the adequacy decision could be challenged, amended and/or withdrawn in due course, and it may ultimately prove to be an interim measure.

For those wanting to rely on the DPF, it may therefore be prudent to retain (or, if not already in place, contractually provide for) alternative data protection safeguards such as SCCs or binding corporate rules ("BCRs"). We expect most organisations to take a "wait-and-see" approach for now, although we may start to see greater reliance on the DPF once milestones such as the first European Commission review and the outcome of any "Schrems III" challenge have taken place.

Whilst US data importers are already able to self-certify to the DPF together with the so-called "UK extension", UK data exporters cannot rely on this mechanism for transfers until the UK government has issued its own adequacy decision in respect of the DPF and the UK extension.

For a deeper dive into the analysis and practical implications of the EU-US adequacy decision, please refer to our fuller blog post here.

On 4 July 2023, the European Commission (the "Commission") adopted a proposal setting out further procedural rules for the GDPR (the "Proposed Regulation"). This comes after the Commission found, in its report following two years of the GDPR's application, that there was disparity between Data Protection Authorities' ("DPAs") interpretations of GDPR concepts and procedures.

The decentralised approach, which provides member state citizens with local DPAs, should make it easier for citizens to exercise their data rights, but this has come at the cost of consistency in the GDPR's application across member states. The Proposed Regulation should provide DPAs with procedural rules for cross-border cases in which affected individuals are located in more than one member state. One such rule is an obligation on the lead DPA to send a summary of key issues to the other DPAs concerned, setting out the main aspects of the case and the lead DPA's views.

The Proposed Regulation also introduces new rules for individuals, outlining the information they will need to submit when making a complaint. It will also give businesses greater clarity on their due process rights when under investigation by DPAs for a potential GDPR breach.

The Commission has invited public feedback on the Proposed Regulation by 4 September 2023.

The Court of Justice of the European Union ("CJEU") ruled on 4 July 2023 that Meta should not have relied on its "legitimate interests" when processing its users' personal data for targeted advertising purposes, and should instead have relied on users' prior consent. The judgment means that other large technology companies collecting large volumes of personal data from their users to provide targeted advertising should also consider revisiting any "legitimate interests" basis for processing that data.

In January 2023, the Irish Data Protection Commissioner ordered Meta to reassess the legal basis on which it relied for its targeted advertising activities, having fined the company €390 million for unlawfully gathering users' personal data to personalise advertisements pursuant to its terms of service (under the "necessary for the performance of a contract" legal basis in the GDPR). Consequently, in April this year, Meta changed the legal basis on which it relied to process personal data to "legitimate interests". However, now that the CJEU's latest decision has ruled this out, Meta will need to either obtain users' prior consent to use their personal data for targeted advertising purposes or cease these practices when handling EU users' personal data.

Meta has since confirmed its intention to change the legal basis used to process "certain data for behavioural advertising for people in the EU, EEA and Switzerland" to "Consent" and that it will share further information on how the process will work in practice over the coming months following further engagement with regulators.

A judgment handed down on 26 July 2023 by the Supreme Court ("SC") has set a new precedent which will affect third party funders of class actions. The SC held that funding agreements under which a third party will receive a share of the damages awarded are classed as "damages-based agreements" ("DBAs") for the purposes of the relevant legislation.

If a DBA does not comply with the applicable regulatory regime, it will be unenforceable, and this is expected to be the case for most third-party litigation funding agreements relating to claims which are currently before, or are set to come before, the English courts. This new classification will also render unenforceable those DBAs which govern opt-out collective proceedings in the Competition Appeal Tribunal ("CAT") where the funder's share is calculated by reference to a proportion of the damages ultimately recovered.

The ruling has the potential to dramatically affect opt-out class actions brought in the near future as the market adjusts to incorporate compliance with the DBA regime into the drafting of third-party litigation funding agreements. It remains to be seen how funders under existing agreements will renegotiate terms to comply with the DBA statutory requirements, if at all.

For further information please refer to our blog post here.

On 18 July 2023, the House of Lords ("HoL") published a report, "Artificial intelligence: Developments, risks and regulation", providing analysis of the technology and of the alternative regulatory approaches available to the UK in comparison to other jurisdictions. In particular, recommendations included: (i) initial divergence from the EU AI Act, whilst enabling voluntary alignment by UK companies with the EU AI Act (to enable exports); (ii) in the near term, alignment with US regulatory standards, while building a coalition with other territories via Sentinel (a suggested national AI lab to research and test safe AI), enabling later divergence as UK regulatory expertise matures; and (iii) in the medium term, establishing an AI regulator in tandem with Sentinel.

Against the backdrop of recent criticism of the UK's proposed pro-innovation, sector-led and principles-based approach to AI-specific regulation (including from the UK's equality and human rights watchdog), the HoL's Communications and Digital Committee (the "Committee") launched an inquiry into large language models ("LLMs") on 7 July 2023. The window for written contributions closes on 5 September 2023. The Committee hopes to ascertain the likely trajectory of LLMs over the next three years and whether the UK's regulatory approach should be amended to ensure it is sufficiently prepared to respond to the opportunities and risks posed by AI technologies.

For further detail on the UK's proposed approach to regulating AI, see our blog here, and for an overview of international regulators' and authorities' approaches to AI-specific regulation, refer to our recent podcast "AI booms – whilst regulation looms" here.

In the EU, trilogue negotiations on the EU AI Act began in mid-July between the European Parliament and the Council of the EU. At the time of writing, further trilogue negotiations are expected to take place towards the end of September, in October and in November. Whilst the Spanish Presidency is keen to finalise the legislative framework within that time frame, this seems ambitious given the challenging areas still to be resolved.

In parallel, the European Commission is working with industry to propose a set of voluntary principles or standards based on the EU AI Act that could apply in the meantime (known as the "AI Pact"). The voluntary principles would be much like the recent voluntary commitments made in the US by seven leading AI developers (including Amazon, Google, Meta, Microsoft and OpenAI).

In the US, the Federal Trade Commission ("FTC") has also begun an investigation into OpenAI's data privacy practices. The ChatGPT provider received a civil investigative demand in the week commencing 10 July on the grounds that its privacy practices may be unfair or deceptive. OpenAI will be required to provide information concerning the impact of its use of AI on consumers, including the effect of false or harmful statements, among other information.

In China, on 13 July 2023, the Cyberspace Administration of China ("CAC") and six other central government regulators issued the final version of the Interim Measures for the Management of Generative Artificial Intelligence Services to support the responsible use of generative artificial intelligence. This follows the CAC's publication of the draft measures for public consultation in April 2023. The measures are now due to take effect on 15 August 2023. With China quick off the mark to regulate AI, the measures will sit alongside existing AI-specific legislation addressing recommendation algorithms and deep synthesis technology.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.