Evidence of meeting #125 for Canadian Heritage in the 44th Parliament, 1st Session. (The original version is on Parliament’s site, as are the minutes.)

A recording is available from Parliament.

Also speaking

Heidi Tworek  Associate Professor, University of British Columbia, As an Individual
Monique St. Germain  General Counsel, Canadian Centre for Child Protection Inc.
Shona Moreau  BCL/JD, Faculty of Law, McGill University, As an Individual
Chloe Rourke  BCL/JD, Faculty of Law, McGill University, As an Individual
Signa Daum Shanks  Associate Professor, University of Ottawa, Faculty of Law, As an Individual
Keita Szemok-Uto  Lawyer, As an Individual

June 13th, 2024 / 5 p.m.

Conservative

Jacques Gourde Conservative Lévis—Lotbinière, QC

I'm a grandfather and I have seven grandchildren. I get the chills when I realize that we can't control artificial intelligence and that producers are using artificial intelligence applications to create content that can truly destroy the lives of the most precious thing we have. I'm talking about our youth, who represent our future.

Can we create tools using the same weapon, artificial intelligence, to tackle this problem? People can't keep watch 24 hours a day. We'll have to find a solution.

It might be a good idea to pass legislation to punish people who promote all this.

That said, when a person realizes that their children or grandchildren are on one of these sites, it takes so long to have the content removed that it might remain accessible for quite some time.

Can we come up with tools to wipe it off the face of the earth?

5 p.m.

BCL/JD, Faculty of Law, McGill University, As an Individual

Chloe Rourke

We're not AI programmers, but I think it's definitely possible to have that, and I think that's one of the reasons that public regulatory bodies ought to work with platforms. The platforms understand how these systems work and how they could be used to help regulate content. The major tech platforms, such as YouTube, Google and Facebook, already do content moderation.

It's a new wave of content that they'll have to account for in their current systems.

5:05 p.m.

Conservative

Jacques Gourde Conservative Lévis—Lotbinière, QC

If you had a magic wand to help us with this study so we could really change things, what would you tell us to do?

5:05 p.m.

BCL/JD, Faculty of Law, McGill University, As an Individual

Chloe Rourke

I keep going back to this, but I really do think that there needs to be a relationship built with platforms and an accountability for the platforms.

What's so shocking to me is really just how accessible this technology is. As long as it remains that accessible, it's inevitable that there will be more people harmed.

One thing that really frustrated me when I was looking at it was the way the technology is presented: in a very neutral, gender-neutral way, almost as if it were not deliberately intended to do harm and to create non-consensual pornography. Here's one example of the way these platforms describe themselves: “sophisticated AI algorithms to transform images of clothed individuals”. They carry a disclaimer that use of the AI should comply with the law and be consensual, when clearly that's not how they're designed to work, and there are no control mechanisms to ensure that the images are actually being used consensually.

I think that, as long as it is that accessible and not challenged, we're going to continue to see these harms. I would start there.

5:05 p.m.

Liberal

The Chair Liberal Hedy Fry

Thank you very much, Mr. Gourde.

I now go to Rachel Bendayan for the Liberals for five minutes.

5:05 p.m.

Liberal

Rachel Bendayan Liberal Outremont, QC

Thank you very much, Madam Chair, and thank you to my colleagues for having me. I don't usually sit on this committee, but I'm pleased to be with you today.

I'd like to begin by saying that I quite agree with the spirit of the amendment moved by Mr. Champoux. I hope the committee will look at that.

Ms. Moreau, I believe you said that trials are long and costly. I used to be a litigator, and I quite agree with you on that.

That said, I just want to make sure I understand your comments.

You say that the time it takes to get a judgment is a little too long.

Is that correct?

5:05 p.m.

BCL/JD, Faculty of Law, McGill University, As an Individual

Shona Moreau

That's correct.

First, I'd like to make a small clarification. I don't have the title “Maître” yet. I have six months to go before I can get it.

With respect to your question, I think that's correct, yes. It's not realistic to think that every single person who has a deepfake photo circulating somewhere on the Internet is going to go through the whole process of challenging it in court. There is so much content causing problems, and it really has an impact on people, so we need to find solutions that are a little more practical.

5:05 p.m.

Liberal

Rachel Bendayan Liberal Outremont, QC

To follow up on Mr. Gourde's comments, I'd like to ask you the following question.

I know you're not a technician, but wouldn't it be interesting, if not preferable, if platforms could quickly recognize deepfakes, even after they've been posted online?

I don't know if that technology exists yet. If not, I hope it will be developed soon.

5:05 p.m.

BCL/JD, Faculty of Law, McGill University, As an Individual

Shona Moreau

If that's possible, then platforms should do it. Platforms are already testing AI systems to perform content moderation. I think they are already looking at nudity.

5:05 p.m.

Liberal

Rachel Bendayan Liberal Outremont, QC

Thank you.

Professor Daum Shanks, allow me—taking off my lawyer hat and maybe putting on my mother hat—to ask you how the online harms legislation that we put forward will empower users to flag harmful content.

I am worried that we are literally asking parents to police the Internet, and I would like to hear from you on the tools in the legislation that might help in that regard.

5:05 p.m.

Associate Professor, University of Ottawa, Faculty of Law, As an Individual

Dr. Signa Daum Shanks

I think that's a great question. I'm going to keep the lawyer's hat on, though, in answering it.

5:05 p.m.

Liberal

Rachel Bendayan Liberal Outremont, QC

Yes, please.

5:05 p.m.

Associate Professor, University of Ottawa, Faculty of Law, As an Individual

Dr. Signa Daum Shanks

One of the things I would like to be put forward in whatever happens...and that's the way I'm going to put it: “whatever happens”.

I might say this too because of your professional background, but I really appreciate the idea of having some functions that are very similar to what have been trends in administrative law. What I'm most concerned with is that people feel they can be themselves (people who have less access to legal counsel and people who are working with whatever commissioner's or ombuds office is functioning) and that they also have the space administrative law has often had to make spontaneous adjustments to procedure, which means everything from having translators for people who speak only English or French, for example, to things as simple as a comfortable chair.

I think Cindy Blackstock has brought up many topics in her work on the long list of little things that can make people feel safer. That's probably my biggest bailiwick: making sure that, whenever someone contacts an official space, whether through a toll-free number or a written report or something else, they feel the support system is right there.

I think in criminal law functions, as someone who's worked a lot for the Crown in the past and then connected with law firms, that first stage of getting things going is incredibly intimidating to people not trained in law. I want to find as many ways to avoid that as possible, and I think we have every obligation to provide that. I did mention earlier that idea of fiduciary obligations, which I think is one of the best ways to help us imagine those ways, because—

5:10 p.m.

Liberal

The Chair Liberal Hedy Fry

Thank you, Professor Daum Shanks. I think we're well over that time.

5:10 p.m.

Liberal

The Chair Liberal Hedy Fry

Thank you very much.

I'm going to go now to the next round, but I don't think we can finish a full round, guys. I'm going to do two five-minute rounds, and then the Bloc and the NDP get two and a half minutes each. We have 15 minutes left, and this will bring us to exactly 5:30. Thank you.

I start with Philip Lawrence for the Conservatives.

You have five minutes, Philip.

5:10 p.m.

Conservative

Philip Lawrence Conservative Northumberland—Peterborough South, ON

Thank you, Madam Chair.

I want to go to you first, Ms. St. Germain. First of all, I just wanted to confirm that I heard what I thought I heard in earlier testimony. You said your organization currently has flagged what I would assume, though I don't think you gave the quantum, is a large amount of material that you've already classified as non-consensual or otherwise inappropriate. Did I get that correctly?

Is she still there? No, she's gone. Okay, I apologize.

I'll go with that line of questioning to you, Ms. Rourke, and you, Ms. Moreau.

My challenge with the legislation is that, while directionally I agree with it, I don't think it's going to help enough people quickly enough. We already have an Internet awash with non-consensual pornography and child pornography. With this legislation, someone has to complain about the content and go to a commissioner before it gets removed.

I see opportunities to improve this legislation by including automatic removal of deepfakes or non-consensual pornography that's already been flagged and tagged. Do you think there's an opportunity there or no?

5:10 p.m.

BCL/JD, Faculty of Law, McGill University, As an Individual

Shona Moreau

I think I wouldn't be prescriptive on an amendment for this specific bill, but I do think there should be work done around that. If it's a future bill, a report or even just working with regulators in order to be able to do something like that, I think that would be really great. Yes.

Thank you.

5:10 p.m.

Conservative

Philip Lawrence Conservative Northumberland—Peterborough South, ON

Continuing along that line, there are a number of different levels that we could put enforcement on. One is the creator of illegal pornography or illegal deepfakes. There are also the platforms, and then there are also the devices, the computers, the laptops and the phones.

The bill doesn't capture a lot of that. I don't think, just because we can't necessarily enforce all the mice, we should then not enforce the elephants. I believe that we should consider going after the platforms directly.

Would you agree with that?

5:15 p.m.

BCL/JD, Faculty of Law, McGill University, As an Individual

Shona Moreau

There's an expression in French that says, le mieux est l'ennemi du bien: the best is the enemy of the good.

I think that having work done on this issue currently is great. Technology is not going away. AI is not going away, and more work is going to be coming down the pipeline. Hopefully, you as legislators will be able to do that work quickly enough to catch up.

I think it also makes us understand that, when we're making legislation now, we have to be looking five to 10 years or even sometimes 25 years out. We can't just be legislating and working on current issues. We almost have to be working on future issues.

5:15 p.m.

Conservative

Philip Lawrence Conservative Northumberland—Peterborough South, ON

I completely agree.

What I haven't heard much testimony on—and I would be curious if any of the witnesses have something to say—is the device level, because the manufacturer, almost by definition, has to be an elephant. There are a limited number of companies that manufacture smart phones and manufacture computers.

Is that something that anyone's come across in their research as a possible way of reducing online harms?

5:15 p.m.

BCL/JD, Faculty of Law, McGill University, As an Individual

Chloe Rourke

The difference between manufacturing a computer or a phone and producing AI technology is that, even if the initial software is developed by a larger company, it's now being made open source. Once it's open source, that code can be modified or appropriated, and there's really no way to take it back out of the public domain once it exists there.

It's a difficult question whenever you're talking about innovation in emerging technologies with multiple different applications. What are the ethics of that? How do you balance it against what is essentially a huge economic incentive to invest in AI technology and its development, and to have the ability to experiment with that technology and see where it grows, while still acknowledging that a technology like this has a lot of harmful applications?

As far as we can see, this technology has fairly limited legitimate commercial applications, in film, media and those kinds of areas, but it can clearly be misused, not just for sexually explicit material but for many other purposes as well.

5:15 p.m.

Conservative

Philip Lawrence Conservative Northumberland—Peterborough South, ON

I believe that's my time.

I'm just going to thank you for all your great work. I appreciate it.

5:15 p.m.

Liberal

The Chair Liberal Hedy Fry

Thank you. That's it.

I now go to Ms. Lattanzio for the Liberals.

Patricia, you have five minutes.

5:15 p.m.

Liberal

Patricia Lattanzio Liberal Saint-Léonard—Saint-Michel, QC

Thank you, Madam Chair.

My question will be for Professor Daum Shanks.

Professor, are you familiar with the proposed changes in Bill C-63 with regard to the mandatory reporting act?