With the use of AI becoming a more commonplace topic, the question of government regulation has become top of mind.

"The use of AI creates new and more pronounced risks of harm to individuals and groups. An AI-specific statute can help prevent these harms from materializing," senior associate Nic Wall said in an interview with Best Lawyers.

"While the common law has shown some ability to adapt to new technology, it often does so only slowly and retrospectively. There is also a benefit to the industry in having concrete legislative parameters instead of hoping that adherence to best practices satisfies various regulators or withstands scrutiny in litigation."

As of June 2023, the Canadian government had introduced Bill C-27, the Digital Charter Implementation Act, which includes the country's first attempt at AI regulation, the Artificial Intelligence and Data Act (AIDA).

"AIDA seems well-intentioned and is a good starting point, but needs some work," Nic said.

"Stakeholders from all sides have been pushing for amendments and clarifications. Industry is requesting certainty and clarity on the scope of application, the meaning of key definitions and who is responsible for complying with the AIDA's substantive requirements."

Nic noted the importance of crafting a statute that isn't overly prescriptive or inflexible, which could stifle innovation. He acknowledged that striking the right balance between certainty and flexibility will no doubt be challenging.

"Ultimately, whether the legislation actually achieves its stated goals, and whether it does so in a way that avoids stifling AI innovation or businesses more generally, will depend a lot on its execution," he said.

You can read more about our Data Governance and Strategy work on our practice page.