Evidence of meeting #125 for Canadian Heritage in the 44th Parliament, 1st Session. (The original version is on Parliament’s site, as are the minutes.) The winning word was deepfakes.

A recording is available from Parliament.

Also speaking

Heidi Tworek  Associate Professor, University of British Columbia, As an Individual
Monique St. Germain  General Counsel, Canadian Centre for Child Protection Inc.
Shona Moreau  BCL/JD, Faculty of Law, McGill University, As an Individual
Chloe Rourke  BCL/JD, Faculty of Law, McGill University, As an Individual
Signa Daum Shanks  Associate Professor, University of Ottawa, Faculty of Law, As an Individual
Keita Szemok-Uto  Lawyer, As an Individual

4:35 p.m.

BCL/JD, Faculty of Law, McGill University, As an Individual

Shona Moreau

I can't speak for the platforms, and I'm not here to represent them. However, I think it would be beneficial for everyone to ensure that these types of technologies are not readily available.

To preserve their reputation, I think these platforms would want to work on that. We're already seeing them take steps to remove this type of content when they see that there's an issue.

4:35 p.m.

Bloc

Martin Champoux Bloc Drummond, QC

We talk a lot about what we can do and the expectations you have of us as politicians and lawmakers. However, as a parent, what do I tell my daughters?

4:35 p.m.

BCL/JD, Faculty of Law, McGill University, As an Individual

Shona Moreau

Tell them that over the next year, you're going to work to make sure that this will no longer be a problem and that they will not be victimized using these technologies.

4:35 p.m.

Bloc

Martin Champoux Bloc Drummond, QC

What steps can they take to guard against that threat? Is there anything they can do? Of course, they share their photos with their friends. Photographs, information and data are all over the place.

4:35 p.m.

BCL/JD, Faculty of Law, McGill University, As an Individual

Shona Moreau

That's why I'm so afraid of these technologies. You can no longer take photos of yourself to give to your boyfriend or a friend, for example. It's online and you're completely vulnerable, because you don't have the power to take it down.

We need to show leadership on this problem, and it's not just a matter of legislating. We need people to find solutions to quickly remove this type of content from the web.

4:35 p.m.

Bloc

Martin Champoux Bloc Drummond, QC

Thank you very much.

4:35 p.m.

Liberal

The Chair Liberal Hedy Fry

Thank you, Ms. Moreau.

I will now go to the New Democrats and Niki Ashton for six minutes.

4:35 p.m.

NDP

Niki Ashton NDP Churchill—Keewatinook Aski, MB

Thank you very much.

I just saw, Ms. Rourke, that you had your hand up. Do you want to add something? I have some questions, but you can add what you were going to say.

4:35 p.m.

BCL/JD, Faculty of Law, McGill University, As an Individual

Chloe Rourke

That's very kind. I was just going to add that you have to see that this technology exists within a societal context of gender-based violence and oppression.

Education and combating that societal and cultural context are part of the solution. It's not going to fix the technology, but education in schools to help students understand the harms, so that teenage boys who have access to the technology know why it's so harmful, is part of the solution. No single thing will fix it.

4:35 p.m.

NDP

Niki Ashton NDP Churchill—Keewatinook Aski, MB

Thank you.

Speaking of education, I actually wanted to direct my first question to Ms. St. Germain. In part, it's because of the extent to which it's clear we are failing Canadian kids. Looking at the statistics, that's very clear.

We heard of the 15,630 incidents of online sexual offences against children and 45,816 incidents of online child pornography reported by police in Canada from 2014 to 2022. We know that the rate of police-reported online child pornography has almost quadrupled since 2014. You spoke of some of these trends.

Specific to the non-consensual distribution of intimate images, a heartbreaking picture also emerges. Most people accused of this offence are of a similar age to their victim, and we know they were previously known to the victim. In these situations, it's clear it's kids victimizing kids without necessarily understanding all of the ramifications, both for the victim's future and for the perpetrator's own.

I want to get back to the topic of education that you talked about. How important is education around consent and sexual safety? When it comes to young people, what more can we be doing to teach them how to keep themselves and each other safe?

4:35 p.m.

General Counsel, Canadian Centre for Child Protection Inc.

Monique St. Germain

Education is always a critical component of any policy or initiative that we have. We have lots of different tools in the tool box. We have our criminal law. We have potentially Bill C-63 coming forward to provide some regulatory.... On the education side, obviously it's ensuring that young people are educated about sexual consent and understand the ramifications of sharing intimate images or creating this type of material, etc.

We can't lose sight of the fact that the reason they can do these things is because of the platforms that facilitate this kind of harm. We should have education for parents and for kids, taught through schools and available in a lot of different media and in the places that kids go. While we can have education, we also need to make sure that we don't lose sight of the fact that there are other actors in the ecosystem who are contributing to the harm that we're all seeing.

4:40 p.m.

NDP

Niki Ashton NDP Churchill—Keewatinook Aski, MB

I believe you did refer to Project Arachnid in your presentation. I'd like to get back to that.

Can you describe the process and work it took to develop this, which I know you referred to? What other tools are you missing that we should be encouraging government to help you with?

4:40 p.m.

General Counsel, Canadian Centre for Child Protection Inc.

Monique St. Germain

Our organization, as I have mentioned, has been operating cybertip.ca for a long time. We are very steeped in technology. Our technological team is very sophisticated in terms of ensuring that the work that we do is done in the most efficient way and that we're not overexposing our staff to things that they don't need to be exposed to.

We leverage technology in a lot of different ways through Project Arachnid. It took a lot of time to develop that system. It's been tweaked over the last several years. It's gotten better and better at doing what it's doing.

What we have now is a very robust source of data that can be relied upon not just by companies but also by governments and other actors. There's a lot of known material out there that human beings have already laid eyes on. It's already classified, and that material can come off of the Internet. However, we need to start using those tools in ways that make sense, instead of overexposing victims by having their imagery repeatedly looked at by different moderators in different countries that have less robust safeguards than we have in institutions like our own and in policing, which does this work on a daily basis.

4:40 p.m.

NDP

Niki Ashton NDP Churchill—Keewatinook Aski, MB

I want to get to the big question around stigma that victims face in terms of coming forward. We know that the most recent data provided by Statistics Canada says that in 2022 there were over 2,500 police reports of non-consensual distribution of intimate images. We know, of course, when it comes to sexual assault, that there's a real issue of under-reporting, given victims' lack of trust in the justice system and the policing system, fearing the stigma that they will face.

I'm wondering how we fix this question of stigma in terms of the work that you're involved in and what you're seeing with young people.

4:40 p.m.

Liberal

The Chair Liberal Hedy Fry

You have 45 seconds to do that.

4:40 p.m.

General Counsel, Canadian Centre for Child Protection Inc.

Monique St. Germain

I'm sorry...?

4:40 p.m.

Liberal

The Chair Liberal Hedy Fry

You have 45 seconds to do that.

4:40 p.m.

General Counsel, Canadian Centre for Child Protection Inc.

Monique St. Germain

I've lost track of the question. I'm sorry.

4:40 p.m.

NDP

Niki Ashton NDP Churchill—Keewatinook Aski, MB

No problem. It's on stigma and dealing with the stigma.

4:40 p.m.

General Counsel, Canadian Centre for Child Protection Inc.

Monique St. Germain

Yes, victims are facing a real stigma. These are the victims of non-consensual distribution of intimate imagery and victims of CSAM.

The types of things that these victims will include in their victim impact statements, for example, are very similar. There is a fear of recognition. There is an unwillingness to participate publicly in different ways because they don't want to be identified or linked to sexually abusive material.

We do have a lot of stigma that is going on online. Part of it is that we are allowing all of this material to be up in public view and not doing a lot to get rid of it. We have the big companies doing the things that they do, but even they can't keep in front of it. Then we have all of the smaller websites that we were referring to earlier in this discussion.

There are a lot of issues going on.

4:40 p.m.

Liberal

The Chair Liberal Hedy Fry

Thank you very much, Ms. St. Germain.

I will now go to the second round. It's a five-minute round for the Conservatives.

Kevin Waugh, you have five minutes, please.

4:40 p.m.

Conservative

Kevin Waugh Conservative Saskatoon—Grasswood, SK

Thank you, Madam Chair.

Welcome to everybody here this afternoon.

Mr. Szemok-Uto, you recently became a lawyer. Thank you for that.

As you know, Bill C-63 did not expand the Criminal Code; it has not been updated to include deepfakes. That seems to be a concern not only around this table, but everywhere.

I don't know why, when you introduce a bill, you would not.... We've seen that since 2017 deepfakes have been accelerating around the world, yet they're not in this bill. What good is the bill when you don't talk about deepfakes?

What are your thoughts?

4:40 p.m.

Lawyer, As an Individual

Keita Szemok-Uto

I would echo your concerns.

I presume there's an issue with the international element that the Internet and deepfakes present. You could have a perpetrator who is in Russia or some small town in a country we don't have much familiarity with. It would be hard to potentially go after those perpetrators.

I did note in my paper that there is inherently a limitation in pursuing criminal prohibition of deepfakes. Again, the standard of proof beyond a reasonable doubt would perhaps play into limiting who is convicted of these crimes, as would the scope and the resources that would be required to actually enforce criminal prohibitions of this kind of behaviour.

I think that with private law, civil remedies and things that are based on the balance of probabilities, there is potentially a wider scope for justice of some kind, if not criminal justice, and at least some disincentive against engaging in this kind of behaviour. That would be my answer.

June 13th, 2024 / 4:45 p.m.

Conservative

Kevin Waugh Conservative Saskatoon—Grasswood, SK

The problem with having an extrajudicial bureaucratic arm, which I think Bill C-63 is, is that it can actually perpetuate harm.

We can see that because victims—some have mentioned it here today—have to come bravely forward and share their stories with a commissioner or a body that they really don't know. That has actually led to no real power. I think the victim in this case is led to hope and then hope is deferred because there are no real teeth in the legislation.

What are your thoughts on that?

I do think Bill C-63 is a bureaucratic nightmare ready to explode if it does get through the House.

4:45 p.m.

Lawyer, As an Individual

Keita Szemok-Uto

I would agree that potentially more could be done than what Bill C-63 presents.

I do note, at least, that there is the inclusion of the word “deepfake” in the legislation. If anything, I think it's a good step forward in trying to tackle and get up to date with the definitions that are being used.

I note that legislation recently passed in Pennsylvania is criminal legislation that prohibits the dissemination of artificially generated sexual depictions of individuals. I won't read the definition out in full. It's quite broad.

Bill C-63 does at least refer to deepfakes, though no definition is provided. I think in that respect, broadening the terminology.... As I said, the privacy law torts are restricted by their definitions and terminology, which don't cover this new problem we're trying to deal with. To the extent that the bill brings the definitions up to date, I think it is a step in the right direction.