Opinion: Online Harms Act only one step in preventing damage from deepfakes

Victims, most of whom are women, need better judicial recourse to hold perpetrators accountable in Canada.

Gendered violence: another year, another way to perpetuate the cycle. What is the latest trend? The creation of non-consensual deepfakes.

Canada needs better legal recourse for victims of non-consensual deepfakes to hold perpetrators accountable.

Victims of deepfakes also experience constant anxiety about who has viewed this content or when they might see it next. Even if the content is taken down, it may already have been shared or saved to personal devices.

While the Online Harms Act is a step in the right direction, it still does not hold perpetrators accountable.

The Online Harms Act did not introduce any changes to the Criminal Code. Yet the quality of deepfakes has improved, making them difficult to distinguish from non-digitally altered pictures, and victims face the same consequences as with the non-consensual distribution of intimate images (NCDII). Non-consensual deepfakes of a sexual nature should carry the same judicial consequences as NCDII. We could amend our Criminal Code to criminalize non-consensual deepfakes of a sexual nature.

It would be ideal to implement a new tort that recognizes deepfakes as a social and ethical wrong. This tort could apply when a defendant distributes non-consensual deepfakes of the plaintiff. A perpetrator should not be able to raise the defence that they used media the plaintiff voluntarily uploaded online; the issue at stake is how these once-consensual images are being used.

The only way to avoid becoming a victim of deepfakes is to limit your online presence. But not having an online presence can be disadvantageous in today’s reality, and this approach would place the responsibility on victims, mainly women, instead of condemning perpetrators.

Because the content that is used to create deepfakes has likely been uploaded online consensually, creators of deepfakes might believe that they’re not doing anything wrong, or they simply don’t care. In reality, this type of content is harming women and is just a new way of perpetuating gendered violence.

While, ideally, it would be best to stop the creation and sharing of deepfakes before they happen, doing so can be very difficult, if not impossible. Victims need access to better judicial recourse to hold perpetrators accountable.

Katheryne Soucy is a J.D. candidate (2025) at the University of Ottawa.
