What are autonomous cars?

As clarified by the resolution of the EU Parliament dated February 16, 2017 (the Resolution), which sets out recommendations to the Commission on Civil Law Rules on Robotics (2015/2103(INL)), autonomous transport covers all forms of remotely piloted, automated, connected and autonomous road, rail, waterborne and air transport, including vehicles, trains, vessels, ferries, aircraft and drones, as well as all future forms of development and innovation in this sector. This obviously includes "autonomous cars" (also called "self-driving cars"), which can operate in full autonomy or under the driver's supervision. Such cars make use of, or are equipped with, increasingly complex AI systems, raising a number of legal concerns. These concerns range from issues relating to the use of (personal) data collected by the vehicle, to the identification of the liability of the stakeholders involved (e.g. the developer of the software, the producer of the vehicle, the driver, etc.).

As of today, only a few jurisdictions in Europe—such as Germany and the UK—have adopted specific rules with regard to self-driving cars, whilst case law remains extremely limited.

What about the data?

Self-driving cars rely on large amounts of data to work properly and to ensure a safe ride. Data collection is made possible by sophisticated sensors, high-tech cameras, ultra-precise GPS, radar and black boxes that allow the vehicle to collect and process data about the environment (traffic lights, road signs and other obstacles) and about the passengers (e.g. driver/passenger preferences and personal information). In addition, self-driving cars will be (or maybe already are...) able to "talk" to each other, to their manufacturers and to their owners.

While personal data necessary to provide the service can be collected and processed in compliance with European Regulation 679/2016 (GDPR), this wealth of information could also be used to predict users' preferences and personalize advertising, and there is a risk that it could end up in the wrong hands. Users could face substantial privacy and data security risks, in particular from potential hackers, with consequences also for passenger safety. By accessing the driver's location at all times, a hacker could know the perfect time of day to break into a house; by tapping into a self-driving car, hackers could take remote control of the vehicle. They could also collect, sell or leak personal data, allowing marketers to use driver/passenger information for extremely precise marketing strategies.

It is therefore essential that programmers and manufacturers develop systems in compliance with the privacy by design principle and ensure appropriate security measures to keep user data private. In addition, data subjects should receive a complete information notice regarding the processing of their data and, where personal data are used for purposes other than providing the service, their consent must be collected. It is also generally advisable to conduct a data protection impact assessment (DPIA). To better assess certain privacy concerns, please also refer to Artificial Intelligence vs Data Protection: the main concerns and Artificial Intelligence vs Data Protection: which safeguards?

Liability issue: Who is to blame?

As mentioned in the Resolution, under the current legal framework robots cannot be held liable per se for acts or omissions that cause damage to third parties. Consequently, national civil liability rules apply in order to identify the liable human agent among those that contributed to the development of the autonomous car, including, for instance, the:

  • Programmer who developed the AI algorithm for the self-driving car;
  • Manufacturer who produced the car;
  • Owner who bought the car; and/or
  • User who is driving the car when the accident takes place. 

The fact that there are many parties involved does not make it any easier to determine who is liable for the damage caused by the vehicle. In general, in Italy we would take into account two types of liability:

  1. product liability: on the one hand, producers are generally liable under the Product Liability Directive no. 85/374/EEC (this Directive introduces a strict liability regime under which the injured person does not have to prove fault on the part of the producer) and under the Italian Consumer Code while, on the other hand, sellers are responsible for the products placed on the market regardless of whether they include third-party components (see the EU Commission Working Document on Liability for emerging digital technologies issued on April 25, 2018).
  2. liability for harmful actions: under Italian civil law, a number of "strict liability" regimes (responsabilità oggettiva) apply.

 To better understand the multi-layered liability profiles, please see our article related to Robots and Liability: Who is to blame?

Notwithstanding the allocation of liability, victims of an accident caused by a vehicle usually have a direct claim against the insurer of the person responsible, because the Motor Insurance Directive (Directive 2009/103/EC) requires civil liability for the use of vehicles to be covered by insurance. In any case, the Resolution highlights the importance of civil law in determining whether to apply strict liability or a risk management approach.

However, such types of liability do not fully address those cases in which self-driving vehicles operate completely without human intervention. This is probably the reason why the EU Parliament, in its Resolution, considers it essential, in the development of robotics and AI, to guarantee that humans have control over intelligent machines at all times, even though this recommendation collides with the direction taken by the self-driving vehicle industry. Pending the forthcoming evolution of the civil law rules applicable to liability, a possible solution envisaged by the Resolution may also involve an obligatory insurance scheme for robotics that takes into account all potential responsibilities in the chain, supplemented by a fund.

In July 2018 the UK enacted the Automated and Electric Vehicles Act 2018. The Act brought the principles of automated car insurance in line with those applicable to non-automated cars. It ensures that drivers are covered both when they are actually driving and when they have fully handed control over to the vehicle. The legislation states that the insurer (where the vehicle is insured) or the owner (where the vehicle is uninsured) can be held liable for an accident caused by an autonomous car while it is driving itself.

It will be very interesting to see if other European countries (including Italy!) follow the approach taken by the UK.

Do you have more questions, or do you want to share your thoughts on this article? Contact our Dentons Italy TMT Team at tmtbites.italy@dentons.com, and do not forget to sign up to our TMT Bites Newsletter!


The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.