Everyday life has changed significantly since artificial intelligence became a reality. This revolution has influenced our habits both at work and at home. The automation of banking decisions, a robot that cleans the floor, a coffee machine that brews coffee autonomously, a machine that prepares meals from the ingredients at hand: many innovations are now available on the market, and even the most sceptical person may be tempted to turn his/her house into an automated world.

Among all these innovations, the home assistant is one of the most representative, yet it may give rise to a number of issues, mainly connected to the use of the data it collects. Just a few days ago, it was reported that a well-known home assistant's recordings had been disclosed by the developing company to the wrong user; more specifically, a device user who had asked to review his archived data received someone else's conversations. By matching the information contained in the recordings with that available on social networks, it was possible to identify the account owner.

Many commentators believe that there is no way to use a home assistant in a fully privacy-safe manner. Some have even argued that the user, in a sense, "wants" to be profiled: The more information the home assistant collects, the quicker and more customized the services it can provide.

It is, however, important to be aware of the price to be paid: The user shares data pertaining to his/her private life with a third party. The purpose of existing laws is to ensure that the user retains control over his/her data and that neither the controller nor a third party takes an unexpected advantage of such data, but the mere fact that the user welcomes a third party's ear into the home constitutes a vulnerability. In this respect, the trade-off between security and efficiency remains an open question: A large number of devices still lack sufficient security. For a more detailed analysis of AI and data protection, see our post Artificial Intelligence vs Data Protection: which safeguards?

The risk of home devices interfering with private lives has been recognized by the EU Parliament in its resolution of February 16, 2017, which sets out recommendations to the Commission on Civil Law Rules on Robotics (2015/2103(INL)). Under principle 14, the EU Parliament requires that special attention be paid to robots that represent a significant threat to confidentiality owing to their placement in traditionally protected and private spheres and their ability to extract and transmit personal and sensitive data. The Italian Data Protection Authority, too, has stated with reference to home assistants that privacy remains one of the most controversial issues and that, when we accept a series of digital devices and services that facilitate our daily life, it is fundamental to do so consciously (please see link).

How does a home assistant work and when does it collect data?

Simply by asking the home assistant, one can surf the Internet, listen to music, set timers, and control blinds and lights. The functionality of the home assistant is tied to the use of key words: the system may always be listening (although, when switched off, it collects no sound at all), and only when specific words are pronounced does it start capturing audio from its surroundings and processing the request, as sketched below.
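To illustrate the mechanism described above, the following is a minimal conceptual sketch in Python of how wake-word gating might work; it is not any vendor's actual implementation, and names such as WAKE_WORDS, contains_wake_word and run_assistant are purely illustrative assumptions.

```python
# Conceptual sketch only: audio is modelled as a stream of text utterances.
WAKE_WORDS = {"hey assistant", "ok assistant"}  # hypothetical trigger phrases


def contains_wake_word(utterance: str) -> bool:
    """Rough stand-in for on-device wake-word detection."""
    text = utterance.lower()
    return any(phrase in text for phrase in WAKE_WORDS)


def run_assistant(audio_stream):
    """Listen continuously, but only capture audio after a wake word is heard;
    everything else is discarded rather than stored or transmitted."""
    capturing = False
    for utterance in audio_stream:
        if not capturing:
            if contains_wake_word(utterance):
                capturing = True  # wake word heard: start capturing the request
            # otherwise the utterance is dropped, not stored or sent anywhere
        else:
            # in a real device this captured audio would be sent for processing
            print(f"Captured request: {utterance!r}")
            capturing = False  # stop capturing once the request is handled


if __name__ == "__main__":
    sample_stream = [
        "private conversation at the dinner table",  # ignored
        "hey assistant",                             # wake word detected
        "turn off the living room lights",           # captured and processed
        "more private conversation",                 # ignored again
    ]
    run_assistant(sample_stream)
```

The point of the sketch is simply that, under this design assumption, data collection is gated by the wake word: what precedes it never leaves the device, while what follows it does, which is precisely where the privacy questions discussed in this article arise.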

It is worth noting that privacy has become a major issue for developers of AI technology. This is why data controllers have spent a significant amount of time, effort and money developing complex systems that allow the user to keep control of the data, in order to reduce their liability. Stakeholders from all over the world are increasingly investing in the AI value chain. When it comes to creating home assistants, a number of players are involved, from software developers to hardware manufacturers, distributors and retailers (see our previous post Robots and Liability: who is to blame?). Such a chain of involvement may lead to situations in which some stakeholders fail to respect privacy laws. In particular, stakeholders acting as co-controllers or joint data controllers may, when sharing data, wrongly rely on the other co-controllers for compliance with privacy laws.

Thus, whenever dealing with home assistants from a legal perspective, one of the main issues remains ensuring data protection compliance throughout the whole production and distribution chain, adequately addressing the role of each stakeholder involved.

This article was co-authored by Fabia Cairoli and Valeria Schiavo.

Dentons is the world's first polycentric global law firm. A top 20 firm on the Acritas 2015 Global Elite Brand Index, the Firm is committed to challenging the status quo in delivering consistent and uncompromising quality and value in new and inventive ways. Driven to provide clients a competitive edge, and connected to the communities where its clients want to do business, Dentons knows that understanding local cultures is crucial to successfully completing a deal, resolving a dispute or solving a business challenge. Now the world's largest law firm, Dentons' global team builds agile, tailored solutions to meet the local, national and global needs of private and public clients of any size in more than 125 locations serving 50-plus countries. www.dentons.com.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.