On February 26, 2024, Bill C-63, An Act to enact the Online Harms Act (the "Bill"), received first reading in the House of Commons of the Parliament of Canada.1

This post deals with the purpose and scope of the Bill, the changes to be made to the current regime, and their impact on the private businesses that will be required to implement them.

In proposing amendments to the Criminal Code, the Canadian Human Rights Act, An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service and other related legislation, the Bill aims to strengthen the protection of children online, better protect Canadians from hate propaganda and other types of harmful content online, and ensure that operators of social media services subject to the legislation are transparent and accountable in carrying out their duties under it.2

For some years now, responsibility for online safety has rested with users of the services or their parents. Online platforms and live streaming services have no specific duty to monitor their users or take action except when an incident or potential harm is reported to them. Under Quebec law, a platform manager has no obligation to monitor the information stored on its site or to verify that it is not being used for illicit activities,3 as provided under section 27 of the Act to establish a legal framework for information technology.4 Similarly, a service provider acting as an intermediary is not responsible for the content that users store on its platform.5 An intermediary may nonetheless be held responsible if it is aware that material hosted on its platform is illicit in nature and refuses to remove it.6

Purpose of the Bill: harmful content

Bill C-63 would therefore impose heightened responsibility and transparency requirements on social media operators across the country, notably a duty to take action, protect children, make harmful content inaccessible, and keep the necessary records. To meet this responsibility, social media operators and distribution services would need to establish specific measures to reduce the risk arising from seven types of harmful content, namely:

  • Content that sexually victimizes a child or revictimizes a survivor;
  • Intimate content communicated without consent;
  • Content that foments hatred;
  • Content that incites violent extremism or terrorism;
  • Content that incites violence;
  • Content used to bully a child;
  • Content that induces a child to harm themselves.7

Duties of social media operators and distribution services

Part 1 of the Bill sets out three fundamental duties for social media operators and distribution services: (1) a duty to act responsibly, (2) a duty to protect children, and (3) a duty to make certain content inaccessible. These duties are broken down into a number of different requirements, as detailed below.

If the Bill is passed, social media operators and distribution services will be required to assess the risks of exposure to harmful content that their platforms present and adopt appropriate and effective measures to reduce those risks.8 They will also need to provide their users with guidance and tools for reporting harmful content and blocking other users.9 In addition, platforms will be required to have a service to receive user complaints or provide advice about online harm, and to designate a contact person for this purpose.10 Harmful content that is repeatedly published or artificially amplified through automated communications will have to be labelled as such.11 This requirement would apply, for example, to content disseminated on a large scale by bots or botnets. Platforms will also be required to submit to the Digital Safety Commission of Canada digital safety plans detailing the measures taken, their effectiveness, the indicators used to assess them and any analysis of new risks or trends in online safety, and to make this information publicly available.12 Lastly, platforms will be required to identify the datasets used to ensure digital safety, store them and, if applicable, share them with qualified researchers.13

With regard to the duty to protect children,14 the businesses concerned will be required to integrate into their services design features that protect children, such as age-appropriate safety measures.15 The services in question will have to comply with the measures and guidelines respecting the protection of children established by the Commission. Examples of measures that could be imposed include default parental-control settings, warning labels for children, and safe-search functions within the service in question. The Commission could also require that services incorporate means of limiting children's exposure to harmful content, including explicit sexual content, bullying content and content that encourages self-harm.16

Regarding the duty to restrict access to certain content, the Bill calls for social media platforms and distribution services to block their users' access to the two types of content considered most harmful:

  • Content that sexually victimizes a child or revictimizes a survivor; and
  • Intimate content communicated without consent, including a deepfake of a sexual nature.

The content in question will need to be made inaccessible within 24 hours or within any other period prescribed by regulation.17 Before making flagged content inaccessible, operators will first need to assess whether the flag is trivial, frivolous, vexatious or made in bad faith, and whether the content is or has already been the subject of another flag such that further assessment is unnecessary having regard to all of the circumstances.18
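
To make the sequence concrete: the flag assessment operates as a two-step screen before the takedown obligation applies. The short Python sketch below reflects our schematic reading of this process; the function and field names are hypothetical and do not come from the Bill.

    from dataclasses import dataclass

    @dataclass
    class Flag:
        # Both fields are hypothetical labels for the statutory criteria.
        trivial_frivolous_vexatious_or_bad_faith: bool
        already_flagged_and_assessed: bool

    def must_make_inaccessible(flag: Flag) -> bool:
        # Step 1: a flag that is trivial, frivolous, vexatious or made
        # in bad faith does not trigger the takedown duty.
        if flag.trivial_frivolous_vexatious_or_bad_faith:
            return False
        # Step 2: nor does a duplicate flag, where the content has already
        # been assessed and further assessment is unnecessary having
        # regard to all of the circumstances.
        if flag.already_flagged_and_assessed:
            return False
        # Otherwise, the content must be made inaccessible within
        # 24 hours or any other period prescribed by regulation.
        return True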

Two new bodies to implement the Bill and ensure that the rights and duties it sets out are respected

The Bill provides for the creation of two bodies to oversee the rights and duties of the businesses concerned:

  • Creation of the Digital Safety Commission of Canada

The Bill provides for the creation of a commission with a mandate to oversee and enforce the regulatory framework of the Online Harms Act. Its powers would include receiving and managing user complaints about breaches of duties by platforms and services, having certain harmful content removed, and helping society become more resilient to online harm by developing new safety standards and providing educational resources to limit risk.19

  • Creation of a digital safety ombudsman's office

Also planned is the creation of a new digital safety ombudsman's office to support and defend the public interest in systemic online safety issues. Appointed for a five-year term, the ombudsman would be a resource person for users as well as victims. The ombudsman's role would include consulting with users, issuing calls for contributions on specific topics, and highlighting significant or systemic issues relating to online harm with the entities concerned.20

Legislative amendments arising from Bill C-63

  • Adding hate crime offences to the Criminal Code

The Bill also adds definitions of "hate crime" and "hatred" to the Criminal Code to clarify the scope of these offences for people in Canada. The amendments to the Code would also treat any offence under the Criminal Code or any other Act of Parliament as a hate crime when the underlying act was motivated by hatred based on race, national or ethnic origin, language, colour, religion, sex, age, mental or physical disability, sexual orientation, or gender identity or expression. The new offence would carry a maximum penalty of life imprisonment, and hate crimes could then be charged and prosecuted as standalone offences. The Bill would also increase the maximum penalties for hate propaganda offences.21

  • Amendment to the Canadian Human Rights Act

The Bill would amend the Canadian Human Rights Act to make the communication of hate speech a discriminatory practice. The Canadian Human Rights Commission would be responsible for handling the corresponding complaints.22

  • Amendment to the Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service

The Bill would also introduce changes to enhance law enforcement's response to crimes involving the sexual exploitation of children. This would be done by creating a regulatory body responsible for enforcement, clarifying which Internet services are covered, simplifying the notification process, extending data retention and limitation periods, and adding regulatory powers.23

Proposed penalties

Lastly, the Bill creates a system of administrative monetary penalties that could be applied to social media operators and distribution services that fail to perform their duties under the legislation. The maximum penalty for a violation would be 6% of the gross global revenue of the person believed to have committed the violation or $10 million, whichever is greater.24 The factors to be taken into account in setting the penalty would include the nature and scope of the violation, the person's history of compliance with the legislation, any benefit the person obtained by committing the violation, the person's ability to pay the penalty and the likely effect of paying it on their ability to carry on business, and the purpose of the penalty.25
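
By way of illustration only, the penalty ceiling is a simple "greater of" calculation. The minimal Python sketch below applies the 6% / $10 million formula described above to hypothetical revenue figures; the figures themselves are not drawn from the Bill.

    def max_penalty(gross_global_revenue: float) -> float:
        # Ceiling on an administrative monetary penalty under the Bill:
        # the greater of 6% of gross global revenue or $10 million.
        return max(0.06 * gross_global_revenue, 10_000_000.0)

    # Hypothetical examples:
    print(max_penalty(50_000_000))   # $10M: 6% would be only $3M
    print(max_penalty(500_000_000))  # $30M: 6% exceeds the $10M alternative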

Other penalties relating to hate crime offences are also set out in the Criminal Code.26

Key Takeaways

Bill C-6327 introduces several changes to the duties of social media operators and distribution services in Canada with respect to monitoring online harm. Not only would the businesses concerned have to take steps to protect their users and limit their access to the various types of harmful content, but they would also be required to report their findings to the Commission.

Since the Bill is only at the first reading stage, it may well be amended before being adopted. It is nonetheless clear from the Government of Canada's news releases that the online safety and accountability requirements to which operators would be subject will be demanding, and that many businesses will have to change their business processes as a result.

We will continue to monitor the Bill's progress and keep you informed of any important developments. We are here to provide the help and advice you need as you consider implementing the changes required within your business to comply with Canada's online safety requirements.

Footnotes

1. Government of Canada, News release, "Government of Canada introduces legislation to combat harmful content online, including the sexual exploitation of children" (February 26, 2024), online: <https://www.canada.ca/en/canadian-heritage/news/2024/02/government-of-canada-introduces-legislation-to-combat-harmful-content-online-including-the-sexual-exploitation-of-children.html>.

2. Bill C-63, An Act to enact the Online Harms Act, First Session, 44th Parliament, 2024 (first reading February 26, 2024).

3. Lehouillier-Dumas c. Facebook inc., 2021 QCCS 3524.

4. Act to establish a legal framework for information technology, CQLR, c C-1.1 ("ALFIT"), s. 27.

5. ALFIT, s. 22.

6. ALFIT, s. 22, para. 2.

7. Government of Canada, Backgrounder, "Government of Canada introduces legislation to combat harmful content online, including the sexual exploitation of children" (February 26, 2024), online: <https://www.canada.ca/en/canadian-heritage/news/2024/02/backgrounder--government-of-canada-introduces-legislation-to-combat-harmful-content-online-including-the-sexual-exploitation-of-children.html>.

8. Supra, note 2, ss. 54 and 55.

9. Supra, note 2, s. 58.

10. Supra, note 2, s. 61.

11. Supra, note 2, s. 60.

12. Supra, note 2, s. 62.

13. Supra, note 2, s. 63.

14. Supra, note 2, s. 64.

15. Supra, note 2, s. 65.

16. Supra, note 2, s. 66.

17. Supra, note 2, s. 67.

18. Supra, note 2, s. 68.

19. Supra, note 2, ss. 10-11.

20. Supra, note 2, ss. 29-31.

21. Supra, note 2, Part 2.

22. Supra, note 2, Part 3.

23. Supra, note 2, Part 4.

24. Supra, note 2, s. 101.

25. Supra, note 2, s. 102.

26. Supra, note 2, Part 2.

27. Supra, note 2.
