The Government of Canada introduced its long-anticipated "online harms" legislation on February 26, 2024, by tabling Bill C-63 in the House of Commons. Bill C-63, which would enact the Online Harms Act (the "Act"),1 is the Government of Canada's second attempt at legislation to address online harms, arriving nearly three years after its first attempt (then named Bill C-36) died on the order paper when the 2021 federal election was called. The Act aims to "promote the online safety of persons in Canada, reduce harms caused to persons in Canada as a result of harmful content online and ensure that the operators of social media services in respect of which that Act applies are transparent and accountable with respect to their duties under that Act", among other things.

This bulletin summarizes the proposed Act's definition of "harmful content" and the new obligations it would impose on the social media services regulated under the Act. It also provides an overview of the mandate of the proposed new regulator, the Digital Safety Commission, including its enforcement powers and the penalties and sanctions it may impose, and the implications for regulated social media services.

Regulated Services

The Act will impose several significant duties on the operators of certain "social media services" that are regulated under the Act as "regulated services". The Act defines a social media service as follows:

social media service means a website or application that is accessible in Canada, the primary purpose of which is to facilitate interprovincial or international online communication among users of the website or application by enabling them to access and share content.‍

The Act clarifies that a social media service includes adult content services and live streaming services that enable users to access and share content by live stream.

A "regulated service" is, in turn, defined as a social media service that either meets a prescribed user threshold (to be set out in regulations) or that the Governor in Council designates as a regulated service where there is "a significant risk that harmful content is accessible on the service."

Regulated Harms

The Act targets seven types of online harms that it deems to be "harmful content":

  • intimate content communicated without consent;
  • content that sexually victimizes a child or revictimizes a survivor;
  • content that induces a child to harm themselves;
  • content used to bully a child;
  • content that foments hatred;
  • content that incites violence; and
  • content that incites violent extremism or terrorism.

"Intimate content communicated without consent" means a visual recording where a person is nude, exposes sexual organs or particular regions, or is engaged in explicit sexual activity, where there is a reasonable expectation of privacy, and the person does not consent to the recording being communicated. Such content also includes a visual recording that falsely presents a person in the above manners, including deepfakes, if it is reasonable to suspect that the person does not consent to the recording being communicated.

Notably, the Act does not define hate, but it does clarify that content does not foment hatred solely because it "expresses disdain or dislike or it discredits, humiliates, hurts or offends".

Duties Imposed on Regulated Services

The Act imposes a number of duties on regulated services, including:

  • Duty to act responsibly – an operator will be required to implement adequate measures to mitigate the risk of exposure to harmful content for users (including measures to be specified by regulation), to publish user guidelines, to provide tools to block users and flag harmful content, to label certain automated harmful content, to make available a resource person to users with respect to harmful content and the foregoing measures, and to prepare and submit digital safety plans in accordance with the Act;
  • Duty to protect children – an operator will be required to integrate into its regulated service design features respecting the protection of children, which will be determined by regulation;
  • Duty to make certain content inaccessible – an operator will be required to ensure that content that sexually victimizes a child or revictimizes a survivor, as well as intimate content communicated without consent, is inaccessible to Canadians in certain circumstances within 24 hours (or such other period set by regulation) of discovering such content or such content being flagged; and
  • Duty to keep records – an operator will be required to keep all records necessary to determine whether they have complied with the Act.

New Regulatory Bodies and Mandates

Bill C-63 will establish the Digital Safety Commission, a new regulatory body composed of three to five full-time members, with a mandate to:

  • administer and enforce the Act;
  • ensure that regulated service operators are transparent and accountable in their duties under the Act;
  • investigate complaints relating to content that sexually victimizes a child or revictimizes a survivor and intimate content communicated without consent;
  • contribute to the development of standards with respect to online safety through research and educational activities;
  • facilitate the participation of Indigenous peoples of Canada and interested persons in the Commission's activities; and
  • collaborate with interested persons, including operators, the Commission's international counterparts and other persons having professional, technical or specialized knowledge.

The Commission will have broad powers to ensure operators comply with the Act and to investigate complaints regarding content that sexually victimizes a child or revictimizes a survivor and intimate content communicated without consent. The Commission would also be able to impose monetary penalties of up to 6% of gross global revenue or $10 million, whichever is greater.

The Act would also establish a Digital Safety Ombudsperson of Canada whose role will be to address concerns from, and to provide support to, members of the public regarding harmful content. The Ombudsperson will also serve as an advocate for the public interest in relation to online safety.

The Digital Safety Commission will also oversee the Digital Safety Office of Canada, which will support the Commission as well as the Ombudsperson.

Footnote

1. Bill C-63 would also amend the Criminal Code, the Canadian Human Rights Act, and An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.