On February 26, 2024, the Canadian government brought its long-awaited Online Harms Act (Bill C-63) to Parliament. This substantive and expansive legislation is intended to reduce users' exposure to harmful online content and to make certain online service providers more transparent and accountable in how they deal with that content.

Since its tabling in the House of Commons, there has been a steady stream of commentary and debate about what this proposed law will mean for victims of abuse, proponents of freedom of speech, impressionable young people, and marginalized groups frequently targeted by hate speech.

In this blog post, I would like to focus on what this bill means for victims of "revenge porn." Although publishing intimate content online without consent is already illegal, this bill puts the onus on social media companies, live streaming services, and user-uploaded adult content services to review, take down, and either securely store or dispose of content flagged as designed to harm a person's dignity and privacy.

What Online Harms Does the Bill Target?

The Online Harms Act defines harmful content as:

  • intimate content communicated without consent;
  • content that sexually victimizes a child or re-victimizes a survivor;
  • content that induces a child to harm themselves;
  • content used to bully a child;
  • content that foments hatred;
  • content that incites violence; and
  • content that incites violent extremism or terrorism.

According to the bill's text, "intimate content communicated without consent" can mean:

  • a visual recording, such as a photographic, film or video recording, in which a person is nude or is exposing their sexual organs or anal region or is engaged in explicit sexual activity, if it is reasonable to suspect that the person had a reasonable expectation of privacy at the time of the recording, and the person does not consent to the recording being communicated; or
  • a "deepfake" visual recording, such as a photographic, film or video recording, that falsely presents a person in a reasonably convincing manner as being nude or exposing their sexual organs or anal region or engaged in explicit sexual activity, if it is reasonable to suspect that the person does not consent to the recording being communicated.

How Does This Bill Target Intimate Content Communicated Without Consent?

Much of the harmful content this bill targets is already illegal to some degree. For instance, the Criminal Code already prohibits voyeurism (section 162), obscene publication (section 163), criminal harassment (section 264), extortion (section 346), and defamatory libel (sections 298-300).

However, a working group of senior officials, mandated by the federal, provincial, and territorial ministers responsible for Justice and Public Safety to identify potential gaps in the Criminal Code on cyberbullying and the non-consensual distribution of intimate images, has noted:

"In relation to adults, there are concerns relating to the ability of the criminal law to respond to [non-consensual distribution of intimate images], absent additional aggravating features that may bring the conduct at issue within the scope of existing offences... For example, the offence of voyeurism only applies if the image is taken surreptitiously, and in the situation at issue, the images are most often taken with the consent of the person depicted. The offence of obscene publication would only apply if the image depicted was one of violence and sex, which is not a typical situation.Criminal harassment requires that the victim actually fear for their safety or the safety of someone known to them. The result of this type of conduct is usually embarrassment or humiliation caused by the breach of privacy, but not necessarily a fear for one's safety. Although existing criminal offences may apply in certain situations, they do not address the identified harm and therefore are not adequately responsive to the non-consensual distribution of intimate images."

In this respect, the proposed Online Harms Act's provisions are best conceived of as another tool in the toolbox that victims of "revenge porn" can use to take action against their abusers. The legislation not only puts the onus on online service providers to act responsibly and develop measures to mitigate the risk that users will be exposed to harmful content, but also requires them to act when alerted to such content.

The duties this legislation imposes on operators of regulated services include:

  • The duty to act responsibly.
  • The duty to implement certain measures (tools to block users from finding or communicating with other users, tools to flag harmful content, a dedicated resource person to hear concerns and direct users to a complaints process).
  • The duty to make certain content inaccessible once it is identified or flagged.
  • The duty to preserve certain harmful content for a period of one year beginning on the day on which the content is made inaccessible.
  • The duty to keep records involving identified/flagged content.
  • The duty to destroy this content (and other related computer data and records) as soon as feasible after the one-year period, unless the operator is required to preserve the content, data or document under a judicial order.

The legislation also proposes to create a new Digital Safety Commission to enforce compliance with the Act, as well as a Digital Safety Ombudsperson to assist users of these services and promote online safety and harm reduction.

Non-compliance by service providers could attract a maximum penalty of six per cent of gross global revenue or $10 million, whichever is greater.

How Effective Will the Online Harms Act Be?

If this legislation is passed by Parliament, Canada will join jurisdictions such as the United Kingdom, the European Union and the United States of America in implementing laws and regulations which put more onus on online social media, streaming, and adult content service providers to take responsibility for harmful user-uploaded content.

Whether the Online Harms Act will be effective in reducing the impact and spread of harmful online content remains to be seen. Overall, I find this legislation positive and promising. It gives victims of "revenge porn" and artificial intelligence-generated "deepfakes" a way to limit the exposure of material designed to hurt, humiliate, and violate their privacy.

Some members of the legal community have raised concerns about definitional clarity. In reading this bill, for instance, I was struck by the use of the word "communicate" when "publish" has been employed so effectively in other laws and is well understood within case law.

Other commentators have suggested this bill may be effective for the public-facing parts of online platforms where the audience for harmful content is potentially unlimited, but contend that it does not tackle distribution of content among smaller audiences where perpetrators of certain harms, such as sextortion, thrive.

For example, this Act does not apply to any private messaging feature on an otherwise regulated service.

Private messaging is defined as a feature that enables a user to communicate content to a limited number of users determined by that user, and that does not enable a user to communicate content to a potentially unlimited number of users not determined by that user. In essence, if a user distributes harmful content by private message to one or more other users, but not to a potentially unlimited number of users, the duties imposed by the Act would not apply to the operators of the online platform.

Sharing "revenge porn" or "deepfakes" in a private message among people who may know the intended victim personally or professionally is a real concern, and while a person harmed by such an act may be able to take action against the person sharing this content through other legal means, an explicit process to flag such content, preserve it for potential judicial action, and make it (at least temporarily) inaccessible to users who have not yet seen the message would significantly reduce a victim's exposure to this harm.

Similarly, where content has been identified as harmful on the public-facing portion of an online platform, legislation compelling operators to use technological means, such as a digital fingerprint, to prevent that content from being shared privately would also help minimize a victim's exposure to further harm.

While there are obvious and legitimate concerns about what this kind of reach could mean for the privacy of a user or group of users in these private messaging applications, limiting the spread of specific forms of harmful content (video recordings and images) while preserving privacy for other forms of communication (text and voice messages) may be the right balance here.

However, the law is generally reactive rather than preventative in nature.

Reducing the risk of online harms is a laudable goal, but providing remedies for the aggrieved can itself act as a preventative measure.

If large social media platforms, streaming platforms, and user-uploaded adult content sites are concerned by the significant monetary penalties this bill imposes, or find themselves inundated by the number of 'flags' they must respond to, that may provide the impetus for them to develop their own technological tools and practices that reduce users' ability to post potentially harmful content in the first place.

We're Here to Help.

As an experienced sexual abuse and sexual assault lawyer, I know very well how "revenge porn" and "deepfakes" spread online can hurt a person whose privacy has been violated, who has escaped from an abusive relationship, or who has been targeted by one or more people seeking to embarrass or discredit them for any number of reasons.

Helping my clients go after the people who have harmed them, or who are still trying to do so, is what drives my tireless work ethic, and watching survivors take back a sense of control over their lives inspires me to continue my fierce advocacy.

If you or a loved one has experienced sexual abuse, sexual assault, childhood sexual abuse, or harm from unlawful online content, I encourage you to contact me for a free, no-obligation initial consultation. With great empathy and compassion, I will listen to your story, clearly explain your rights and options, and offer to help in any way I can.

Survivors of abuse who take action against their abusers often not only give themselves a sense of justice and closure, but also help ensure other people will not have to experience this terrible trauma. Trust the sexual abuse lawyers at Jellinek Ellis Gluckstein Lawyers to be on your side and by your side as you retake control and decide the best path forward for you.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.