In our previous post on Smart homes and new (IT) housemates (Part 1), we addressed how AI and digital transformation can be applied in our everyday life. Digital home assistants are becoming increasingly popular, and their value chain involves a number of players, from software developers to hardware manufacturers, distributors, retailers and so on.

Such a fragmented value chain may lead to data being shared and used for different purposes without the data subject being aware of it.

In fact, the home assistant collects information from its interaction with other devices in the house and from the contacts of the mobile device synchronized with the system. The user is therefore not always in a position to decide which type of information s/he wants to share with the home assistant: this depends on how the information is gathered. By way of example, the user can decide whether to share the content of his/her mobile device, but s/he has no control over data collected as a result of his/her interaction with the system. It is also possible to delete data already collected, but not to select in advance the information one does not want the system to capture. Furthermore, information is stored on servers that might be located anywhere in the world, and certain companies/developers may not be fully aware of the rules on transferring data to countries outside the European Economic Area.

All the above leads to obvious data protection concerns (see, for instance, our post Artificial Intelligence vs Data Protection: the main concerns).

Main purposes of the processing and relevance in trials

Data collected can be used for a number of purposes: in addition to providing the service, the home assistant collects data in order to improve the system itself, develop new services and personalize ads. It is generally up to the user to decide how the information should (or should not) be processed.

In addition to the purposes that the developer envisaged when programming the device, the home assistant can unintentionally collect information that may be useful in a trial. What if the user asks the home assistant to find the most effective poison on the market? That information, the search itself, would be stored in the cloud. If disclosed during a trial, it could be used as compelling evidence.

In addition, the home assistant's records may be relevant to establish whether or not a person was in a specific place at a given time.

There are already a number of trials in which judicial authorities have required the disclosure of home assistants' recordings. If access to the records of a home assistant owned by the defendant favors the defense, access will probably be granted voluntarily; if, on the contrary, the prosecution seeks access, the defendant will probably refuse, and the developing company, which stores the recorded conversations on its servers, will be asked to disclose them. There are many legal implications to be considered in such an event, and developing companies have generally refused disclosure unless a valid and binding legal demand is properly served. In this respect, from a data protection perspective, it is worth noting that under the GDPR data can be processed without the data subject's consent if processing is necessary for compliance with a legal obligation to which the controller is subject (Article 6(1)(c)). In addition, there is probably room to argue that, in specific cases, accessing the records is necessary in order to protect the vital interests of the data subject or of another natural person (Article 6(1)(d)), or that it is necessary for the performance of a task carried out in the public interest or in the exercise of official authority vested in the controller (Article 6(1)(e)).

How can we ensure that home assistants are used in a privacy-safe way?

Like social networks, home assistants raise privacy concerns that can be reduced through cautious behavior. By way of example, the system can be switched off when it is not in use. The home assistant should be kept away from children's rooms and other places where minors or mentally disabled people spend their time; in addition, it should not be used in workplaces, in order to avoid, among other things, remote monitoring of employees and to make sure that confidential information is not disclosed (certainly to the cloud, and possibly elsewhere...). One should never use a home assistant to run ambiguous searches or to play a joke on a friend by recording his/her conversations. As with many other instruments, most legal concerns will relate to actual usage.


The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.