An Act to enact the Online Harms Act, to amend the Criminal Code, the Canadian Human Rights Act and An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service and to make consequential and related amendments to other Acts

Sponsor

Arif Virani, Liberal

Status

Second reading (House), as of June 7, 2024


Summary

This summary is from the published bill.

Part 1 of this enactment enacts the Online Harms Act, whose purpose is to, among other things, promote the online safety of persons in Canada, reduce harms caused to persons in Canada as a result of harmful content online and ensure that the operators of social media services in respect of which that Act applies are transparent and accountable with respect to their duties under that Act.
That Act, among other things,
(a) establishes the Digital Safety Commission of Canada, whose mandate is to administer and enforce that Act, ensure that operators of social media services in respect of which that Act applies are transparent and accountable with respect to their duties under that Act and contribute to the development of standards with respect to online safety;
(b) creates the position of Digital Safety Ombudsperson of Canada, whose mandate is to provide support to users of social media services in respect of which that Act applies and advocate for the public interest in relation to online safety;
(c) establishes the Digital Safety Office of Canada, whose mandate is to support the Digital Safety Commission of Canada and the Digital Safety Ombudsperson of Canada in the fulfillment of their mandates;
(d) imposes on the operators of social media services in respect of which that Act applies
(i) a duty to act responsibly in respect of the services that they operate, including by implementing measures that are adequate to mitigate the risk that users will be exposed to harmful content on the services and submitting digital safety plans to the Digital Safety Commission of Canada,
(ii) a duty to protect children in respect of the services that they operate by integrating into the services design features that are provided for by regulations,
(iii) a duty to make content that sexually victimizes a child or revictimizes a survivor and intimate content communicated without consent inaccessible to persons in Canada in certain circumstances, and
(iv) a duty to keep all records that are necessary to determine whether they are complying with their duties under that Act;
(e) authorizes the Digital Safety Commission of Canada to accredit certain persons that conduct research or engage in education, advocacy or awareness activities that are related to that Act for the purposes of enabling those persons to have access to inventories of electronic data and to electronic data of the operators of social media services in respect of which that Act applies;
(f) provides that persons in Canada may make a complaint to the Digital Safety Commission of Canada that content on a social media service in respect of which that Act applies is content that sexually victimizes a child or revictimizes a survivor or intimate content communicated without consent and authorizes the Commission to make orders requiring the operators of those services to make that content inaccessible to persons in Canada;
(g) authorizes the Governor in Council to make regulations respecting the payment of charges by the operators of social media services in respect of which that Act applies, for the purpose of recovering costs incurred in relation to that Act.
Part 1 also makes consequential amendments to other Acts.
Part 2 amends the Criminal Code to, among other things,
(a) create a hate crime offence of committing an offence under that Act or any other Act of Parliament that is motivated by hatred based on certain factors;
(b) create a recognizance to keep the peace relating to hate propaganda and hate crime offences;
(c) define “hatred” for the purposes of the new offence and the hate propaganda offences; and
(d) increase the maximum sentences for the hate propaganda offences.
It also makes related amendments to other Acts.
Part 3 amends the Canadian Human Rights Act to provide that it is a discriminatory practice to communicate or cause to be communicated hate speech by means of the Internet or any other means of telecommunication in a context in which the hate speech is likely to foment detestation or vilification of an individual or group of individuals on the basis of a prohibited ground of discrimination. It authorizes the Canadian Human Rights Commission to deal with complaints alleging that discriminatory practice and authorizes the Canadian Human Rights Tribunal to inquire into such complaints and order remedies.
Part 4 amends An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service to, among other things,
(a) clarify the types of Internet services covered by that Act;
(b) simplify the mandatory notification process set out in section 3 by providing that all notifications be sent to a law enforcement body designated in the regulations;
(c) require that transmission data be provided with the mandatory notice in cases where the content is manifestly child pornography;
(d) extend the period of preservation of data related to an offence;
(e) extend the limitation period for the prosecution of an offence under that Act; and
(f) add certain regulation-making powers.
Part 5 contains a coordinating amendment.


Diversity and Inclusion / Oral Questions

June 18th, 2024 / 2:30 p.m.

Parkdale—High Park Ontario

Liberal

Arif Virani, Liberal, Minister of Justice and Attorney General of Canada

Mr. Speaker, I welcome the Leader of the Opposition recognizing the divisive rhetoric and the division that is occurring in Canadian society right now. We have a problem with hatred, and we have to address that problem. The statistics show that hate crimes have risen 130% in the last five years.

That is why I was proud to stand with CIJA when we tabled Bill C-63, the online harms legislation that would improve penalties for hate crimes, provide a definition of hatred and ensure that we are keeping Canadian communities safe. The special envoy on anti-Semitism supports the bill. CIJA supports the bill. I am just wondering why the Leader of the Opposition does not.

June 13th, 2024 / 5:20 p.m.

Liberal

Patricia Lattanzio Liberal Saint-Léonard—Saint-Michel, QC

In your introduction, Professor Daum Shanks, you spoke about the difficulty for victims to get specific harms recognized as such, as they're not covered by the existing jurisprudence. I'm curious to know how the digital safety commission and the digital safety ombudsperson proposed in Bill C-63 would help us better support victims and make sure they have their voices heard.

June 13th, 2024 / 5:15 p.m.

Liberal

Patricia Lattanzio Liberal Saint-Léonard—Saint-Michel, QC

Thank you, Madam Chair.

My question will be for Professor Daum Shanks.

Professor, are you familiar with the proposed changes in Bill C-63 with regard to the mandatory reporting act?

June 13th, 2024 / 4:50 p.m.

Liberal

Patricia Lattanzio Liberal Saint-Léonard—Saint-Michel, QC

Thank you, Professor Daum Shanks.

In your presentation, you mentioned two observations. You just discussed the first one, so I'm going to go to the second, on responsibility.

How is Bill C-63 addressing the notion of responsibility?

June 13th, 2024 / 4:45 p.m.

Liberal

Taleeb Noormohamed Liberal Vancouver Granville, BC

Thank you so much to the witnesses for being here.

I'm going to split my time with my colleague, Ms. Lattanzio.

One of the things that we have sought to do in addressing some of the concerns related to online harms, and in particular some of the issues that have been raised during the course of the study, is to make sure that our legislation, Bill C-63, the online harms bill, takes on some of these challenges head-on and works. As we have said, we are willing to work with all parties to ensure that it's the best possible bill out there.

I don't know if you had a chance to follow the deliberations of our meeting on Tuesday, but our colleague, Mrs. Thomas, raised what I would argue is a very important concern in a number of her questions. It would appear, at least from my read of it, that she was advocating—I don't want to put words in her mouth, but this is the way that I understood what she was suggesting—that we take a risk-based approach in the legislating of regulations around social media platforms. Our belief is that this is exactly what Bill C-63 proposes to do.

Do you agree, first of all, with Bill C-63 and what we're trying to do, and that the right approach to legislating regulations around social media platforms is really to take a risk-based approach, as suggested by Mrs. Thomas and others?

I would refer that question to you, Professor.

June 13th, 2024 / 4:45 p.m.

Lawyer, As an Individual

Keita Szemok-Uto

I would agree that potentially more could be done than what Bill C-63 presents.

I do note, at least, that there is the inclusion of the word “deepfake” in the legislation. If anything, I think it's a good step forward in trying to tackle and get up to date with the definitions that are being used.

I note that legislation recently passed in Pennsylvania is criminal legislation that prohibits the dissemination of artificially generated sexual depictions of individuals. I won't read the definition out in full. It's quite broad.

Bill C-63 does refer to deepfakes, at least, though no definition is provided. I think in that respect it broadens the terminology. As I said, the privacy law torts are restricted by definitions and terminology that do not cover this new problem we're trying to deal with. To the extent that the bill brings the definitions more up to date, I think it is a step in the right direction.

June 13th, 2024 / 4:45 p.m.

Conservative

Kevin Waugh Conservative Saskatoon—Grasswood, SK

The problem with having an extrajudicial bureaucratic arm, which I think Bill C-63 is, is that it can actually perpetuate harm.

We can see that because victims (some have mentioned it here today) have to come bravely forward and share their stories with a commissioner or a body that they really don't know, and that body has no real power. I think the victim in this case is led to hope, and then that hope is deferred, because there are no real teeth in the legislation.

What are your thoughts on that?

I do think Bill C-63 is a bureaucratic nightmare ready to explode if it does get through the House.

June 13th, 2024 / 4:40 p.m.

Conservative

Kevin Waugh Conservative Saskatoon—Grasswood, SK

Thank you, Madam Chair.

Welcome to everybody here this afternoon.

Mr. Szemok-Uto, you were recently called to the bar. Congratulations on that.

As you know, Bill C-63 does not expand the Criminal Code; the Code has not been updated to include deepfakes. That seems to be a concern not only around this table, but everywhere.

I don't know why, when you introduce a bill, you would not.... We've seen that since 2017 deepfakes are accelerating around the world, yet it's not in this bill. What good is the bill when you don't talk about deepfakes?

What are your thoughts?

June 13th, 2024 / 4:35 p.m.

General Counsel, Canadian Centre for Child Protection Inc.

Monique St. Germain

Education is always a critical component of any policy or initiative that we have. We have lots of different tools in the tool box. We have our criminal law. We have potentially Bill C-63 coming forward to provide some regulatory.... On the education side, obviously it's ensuring that young people are educated about sexual consent and understand the ramifications of sharing intimate images or creating this type of material, etc.

We can't lose sight of the fact that the reason they can do these things is because of the platforms that facilitate this kind of harm. We should have education for parents and for kids, taught through schools and available in a lot of different mediums and the places that kids go. While we can have education, we also need to make sure that we don't lose sight of the fact that there are other actors in the ecosystem who are contributing to the harm that we're all seeing.

June 13th, 2024 / 4:30 p.m.

Bloc

Martin Champoux Bloc Drummond, QC

This type of content is easy to produce. There are even applications for that. It's quite appalling.

People are talking a great deal about Bill C‑63, which seeks to regulate hateful and inappropriate content online.

Beyond legislation, do you feel that the platforms could do more about this?

Do you think they are now able to do more technologically, contrary to what they claim?

June 13th, 2024 / 4:25 p.m.

Associate Professor, University of British Columbia, As an Individual

Dr. Heidi Tworek

We didn't specifically discuss generative AI that much within our group, but I think that within Bill C-63 there's certainly at least an attention to the question of deepfakes. I think there's a concept of a duty to act responsibly that's certainly capacious enough to be able to deal with these kinds of updates. If we're thinking about generative AI companies, they too will have a duty to act responsibly and then I think the question becomes, what exactly should that duty to act responsibly look like in the case of generative AI? A lot of the things we've been talking about today would obviously be a very central part of that.

June 13th, 2024 / 4:25 p.m.

Associate Professor, University of British Columbia, As an Individual

Dr. Heidi Tworek

Thank you very much.

I served with Mr. Krishnamurthy on the expert advisory group. This is something we grappled with quite a lot. Of course, major platforms like Facebook and so on have many employees and can easily staff up, but we often see these harms, particularly now with generative AI lowering the barrier to entry, coming from a couple of individuals who create complete havoc, or from very small firms.

I think there are two aspects to this question. One is the very important question of international co-operation on this. We've talked as if all of the individuals creating harm would be located in Canada, but the truth is that many of them may be located outside of Canada. I think we need to think about what international co-operation looks like. We have this for counterterrorism in the online space, and we need to think about this for deepfakes.

In the case of smaller companies, we can distinguish between those that I think are being abused and then the question of how the proposed online harms bill, Bill C-63, could have a digital safety commissioner who actually helps those smaller firms ensure that these deepfakes are removed.

Finally, we have the question of the more nefarious smaller-firm actors and whether Bill C-63 needs to be expanded so that it is nimble enough to shut down those kinds of nefarious actors more quickly, for example, tools that are only really being put up in order to create deepfakes of the terrible kinds that have been described by other witnesses.

I would just emphasize, finally, that international co-operation is key. Taking things down in Canada only will potentially lead to revictimization, as something might be stored on a server in another country and then continually reuploaded.

June 13th, 2024 / 4:20 p.m.

Conservative

Rachael Thomas Conservative Lethbridge, AB

Thank you.

Bill C-63 does something very interesting. Rather than updating the Criminal Code to include deepfakes.... It doesn't do that at all. Deepfakes aren't mentioned in the bill, and folks are not protected from them—not in any criminal way. I believe that's an issue.

Secondly, platforms will be subject to perhaps an assessment and a ruling by an extrajudicial, bureaucratic arm that will be created, but again, there's nothing criminal attached to platforms allowing for the perpetuation of things like deepfakes or other non-consensual images.

Does that not concern you?

June 13th, 2024 / 4:10 p.m.

Keita Szemok-Uto Lawyer, As an Individual

Madam Chair, committee members, thank you for the opportunity to speak before you this afternoon.

By way of brief background, my name is Keita Szemok-Uto. I'm from Vancouver. I was just called to the bar last month. I've been practising primarily family law, with a mix of privacy and workplace law as well. I attended law school at Dalhousie in Halifax, and while there I took a privacy law course. I chose to write my term paper on the concept of deepfake videos, which we've been discussing today. I was interested in the way that a person could create a deepfake video, specifically a sexual or pornographic one, and how that could violate a person's privacy rights. In writing that paper, I discovered the clear gendered dynamic to the creation and dissemination of these kinds of deepfake videos.

As a case in point, around January this year somebody online made and publicly distributed sexually explicit AI deepfake images of Taylor Swift. They were quickly shared on Twitter and repeatedly viewed; I think one photo was seen as many as 50 million times. In an Associated Press article, a professor at George Washington University in the United States referred to women as "canaries in the coal mine" when it comes to the abuse of artificial intelligence. She is quoted as saying, "It's not just going to be the 14-year-old girl or Taylor Swift. It's going to be politicians. It's going to be world leaders. It's going to be elections."

Even before this, in April 2022, it was striking to see the capacity for essentially anybody to take photos from somebody's social media, turn them into deepfakes and distribute them widely without any real regulation. Again, while the targets of these deepfakes can be celebrities or world leaders, oftentimes they are people without the finances or protections of a well-known celebrity. Worst of all, in writing this paper I discovered that there is really no adequate system of law yet that protects victims from this kind of privacy invasion. I think that is something that is really only now being addressed somewhat with the online harms bill.

I did look at the Criminal Code, section 162, which prohibits the publication, distribution or sale of an intimate image, but the definition of “intimate image” in that section is a video or photo in which a person is nude and the person had a reasonable expectation of privacy when it was made or when the offence was committed. Again, I think the “reasonable expectation of privacy” element will come up a lot in legal conversations about deepfakes. When you take somebody's social media photo, which is taken and posted publicly, it's questionable whether they had a reasonable expectation of privacy when it was taken.

In the paper, I looked at a variety of torts. I thought that if the criminal law can't protect victims, perhaps there is a private cause of action through which victims can sue and perhaps recover damages. I looked at public disclosure of private facts, intrusion upon seclusion and other torts as well, and I just didn't find that anything really satisfied the circumstances of a pornographic deepfake scenario, again with the focus on reasonable expectation of privacy not really fitting the bill.

As I understand it, there have been recent proposals for legislation, as well as legislation that has come into force. In British Columbia there's the Intimate Images Protection Act, from March 2023. The definition of "intimate image" in that act means a visual recording or visual simultaneous representation of an individual, whether or not they're identifiable and whether or not the image has been altered in any way, in which they're engaging in a sexual act.

The broadening of the definition of “intimate image”, as not just an image of someone who is engaged in a sexual act when the photo is taken but altered to make that representation, seems to be covered in the Intimate Images Protection Act. The drawback of that act is that, while it does provide a private right of action, the damages are limited to $5,000, which seems negligible in the grand scheme of things.

I suppose we'll talk more about Bill C-63 in this discussion, and I do think that it goes in the right direction in some regard. It does put a duty on operators to police and regulate what kind of material is online. Another benefit is that it expands the definitions, again, of the kinds of material that should be taken down.

That act, once passed, will require the operator to take down material that sexually victimizes a child or revictimizes a survivor—

June 13th, 2024 / 3:55 p.m.

Monique St. Germain General Counsel, Canadian Centre for Child Protection Inc.

Thank you for the opportunity today.

My name is Monique St. Germain, and I am general counsel for the Canadian Centre for Child Protection, which is a national charity with the goal of reducing the incidence of missing and sexually exploited children.

We operate cybertip.ca, Canada's national tip line for reporting the online sexual exploitation of children. Cybertip.ca receives and analyzes tips from the public and refers relevant information to police and child welfare as needed. Cybertip averages over 2,500 reports a month. Since inception, over 400,000 reports have been processed.

When cybertip.ca launched in 2002, the Internet was pretty basic, and the rise of social media was still to come. Over the years, technology has rapidly evolved without guardrails and without meaningful government intervention. The faulty construction of the Internet has enabled online predators to not only meet and abuse children online but to do so under the cover of anonymity. It has also enabled the proliferation of child sexual abuse material, CSAM, at a rate not seen before. Victims are caught in an endless cycle of abuse.

Things are getting worse. We have communities of offenders operating openly on the Tor network, also known as the dark web. They share tips and tricks about how to abuse children and how to avoid getting caught. They share deeply personal information about victims. CSAM is openly shared, not only in the dark recesses of the Internet but on websites, file-sharing platforms, forums and chats accessible to anyone with an Internet connection.

Countries have prioritized the arrest and conviction of individual offenders. While that absolutely has to happen, we've not tackled a crucial player: the companies themselves whose products facilitate and amplify the harm. For example, Canada has only one known conviction and sentencing of a company for making CSAM available on the Internet. That prosecution took eight years and thousands of dollars. Criminal law cannot be the only tool; the problem is just too big.

Recognizing how rapidly CSAM was proliferating on the Internet, in 2017, we launched Project Arachnid. This innovative tool detects where CSAM is being made available publicly on the Internet and then sends a notice to request its removal. Operating at scale, it issues roughly 10,000 requests for removal each day and some days over 20,000. To date, over 40 million notices have been issued to over 1,000 service providers.

Through operating Project Arachnid, we've learned a lot about CSAM distribution, and, through cybertip.ca, we know how children are being targeted, victimized and sextorted on the platforms they use every day. The scale of harm is enormous.

Over the years, the CSAM circulating online has become increasingly disturbing, including elements of sadism, bondage, torture and bestiality. Victims are getting younger, and the abuse is more graphic. CSAM of adolescents is ending up on pornography sites, where it is difficult to remove unless the child comes forward and proves their age. The barriers to removal are endless, yet the upload of this material can happen in a flash, and children are paying the price.

It's no surprise that sexually explicit content harms children. For years, our laws in the off-line world protected them, but we abandoned that with the Internet. We know that everyone is harmed when exposed to CSAM. It can normalize harmful sexual acts, lead to distorted beliefs about the sexual availability of children and increase aggressive behaviour. CSAM fuels fantasies and can result in harm to other children.

In our review of Canadian case law regarding the production of CSAM in this country, 61% of offenders who produced CSAM also collected it.

CSAM is also used to groom children. Nearly half of the victims who responded to our survivor survey of victims of CSAM identified this tactic. Children are unknowingly being recorded by predators during an online interaction, and many are being sextorted thereafter. More sexual violence is occurring among children, and more children are mimicking adult predatory behaviour, bringing them into the criminal justice system.

CSAM is a record of a crime against a child, and its continued availability is ruining lives. Survivors tell us time and time again that the endless trading in their CSAM is a barrier to moving forward. They are living in constant fear of recognition and harassment. This is not right.

The burden of managing Internet harms has fallen largely to parents. This is unrealistic and unfair. We are thrilled to see legislative proposals like Bill C-63 to finally hold industry to account.

Prioritizing the removal of CSAM and intimate imagery is critical to protecting citizens. We welcome measures to mandate safety by design and tools like age verification or assurance technology to keep pornography away from children. We would also like to see increased use of tools like Project Arachnid to enhance removal and prevent the reuploading of CSAM. Also, as others have said, public education is critical. We need all the tools in the tool box.

Thank you.

June 13th, 2024 / 3:55 p.m.

Associate Professor, University of British Columbia, As an Individual

Dr. Heidi Tworek

Bill C-63 is a step in the right direction to address a problem that, tragically, is swiftly worsening.

I'm looking forward to your questions.

June 13th, 2024 / 3:50 p.m.

Dr. Heidi Tworek Associate Professor, University of British Columbia, As an Individual

Thank you, Madam Chair, for the opportunity to appear virtually before you to discuss this important topic.

I'm a professor and Canada research chair at the University of British Columbia in Vancouver. I direct the centre for the study of democratic institutions, where we research platforms and media. Two years ago I served as a member of the expert advisory group to the heritage ministry about online safety.

Today, I will focus on three aspects of harms related to illegal sexually explicit material online, before discussing briefly how Bill C-63 may address some of these harms.

First, the issue of illegal sexually explicit material online overlaps significantly with the broader question of online harm and harassment, which disproportionately affects women. A survey in 2021 found that female journalists in Canada were nearly twice as likely to receive sexualized messages or images, and they were six times as likely to receive online threats of rape or sexual assault. Queer, racialized, Jewish, Muslim and indigenous female journalists received the most harassment.

Alongside provoking mental health issues or fears for physical safety, many are either looking to leave their roles or unwilling to accept more public-facing positions. Others have been discouraged from pursuing journalism at all. My work over the last five years on other professional groups, including political candidates or health communicators, suggests very similar dynamics. This online harassment is a form of chilling effect for society as a whole when professionals do not represent the diversity of Canadian society.

Second, generative AI is accelerating the problem of illegal sexually explicit material. Let's take the example of deepfakes, which means artificially generated images or videos that swap faces onto somebody else's naked body to depict acts that neither person committed. Recent high-profile targets include Taylor Swift and U.S. Congresswoman Alexandria Ocasio-Cortez. These are not isolated examples. As journalist Sam Cole has put it, “sexually explicit deepfakes meant to harass, blackmail, threaten, or simply disregard women's consent have always been the primary use of the technology”.

Although deepfakes have existed for a few years, generative AI has significantly lowered the barrier to entry. The number of deepfake videos increased by 550% from 2019 to 2023. Such videos are easy to create, because about one-third of deepfake tools enable a user to create pornography, which comprises over 95% of all deepfake videos. One last statistic is that 99% of those featured in deepfake pornography are female.

Third, while illegal sexually explicit material is mostly easy to identify prima facie, we should be wary of online platforms offering solely automated solutions. For example, what if a lactation consultant is providing online guidance about breastfeeding? Wholly automated content moderation systems might delete such material, particularly if trained simply to search for certain body parts like nipples. Given that provincial human rights legislation protects breastfeeding in much of Canada, deletion of this type of content would actually raise questions about freedom of expression. If parents have the right to breastfeed in public in real life, why not to discuss it online? What this example suggests is that human content moderators remain necessary. It is also necessary that they are trained to understand Canadian law and cultural context and that they receive support for the very difficult kind of work they do.

Finally, let me explain how Bill C-63 might address some of these issues.

There are very legitimate questions about Bill C-63's proposed amendments to the Criminal Code and Canadian Human Rights Act, but as regards today's topic, I'll focus briefly on the online harms portion of the bill.

Bill C-63 draws inspiration from excellent legislation in the European Union, the United Kingdom and Australia. This makes Canada a fourth or fifth mover, if not increasingly an outlier in not regulating online safety.

However, Bill C-63 sets out three types of duties for platforms. The first two are a duty to protect children and a duty to act responsibly in mitigating the risks of seven types of harmful content. The third, the most stringent and the most relevant for today, is a duty to make two types of content inaccessible: child sexual exploitation material and non-consensually shared intimate content, including deepfakes. This should theoretically protect the owners of both the face and the body used in a deepfake. A newly created digital safety commission would have the power to require removal of this content within 24 hours, as well as to impose fines and other measures for non-compliance.

Bill C-63 also foresees the creation of a digital safety ombudsperson to provide a forum for stakeholders and to hear user complaints if platforms are not upholding their legal duties. This ombudsperson might also enable users to complain about takedowns of legitimate content.

Now, Bill C-63 will certainly not resolve all issues around illegal sexually explicit material, for example, how to deal with copies of material stored on servers outside Canada—

June 11th, 2024 / 6:45 p.m.

Liberal

Anna Gainey Liberal Notre-Dame-de-Grâce—Westmount, QC

Thank you very much.

Thank you, as well, to all the witnesses for being here.

Mr. Vachon, I also have a question for you.

Ideally, images of the sexual abuse of children would never be published online, of course. Bill C‑63 includes provisions requiring removal of material within 24 hours.

I'd like to know what you think of that tool proposed in the act. Further, are there other tools that could improve this bill or that we should consider?

June 11th, 2024 / 6:40 p.m.

Conservative

Kevin Waugh Conservative Saskatoon—Grasswood, SK

Thank you, Ms. Thomas.

I sat and listened to debate on Bill C-63 on Friday. There was, I think, a high school class watching from the gallery. It was kind of interesting because, as Bill C-63 was debated—and I give the teacher a lot of credit—the government had its statement and the opposition had its statements, and there is a trade-off between a guarantee of security and the rights in the Charter. We have seen that in many of these bills.

Ms. Selby, what would your recommendation be to those high school students? Many of them are just coming into the adult world. What would your recommendation be on the Charter of Rights and their security around sexual exploitation?

June 11th, 2024 / 6:30 p.m.

Founder and Mother, Amanda Todd Legacy Society

Carol Todd

Thank you for your kind words.

I'm going to be frank. Amanda died in 2012. We are now in 2024. We're almost at 12 years. I've stood up, I've used my voice and I've been an advocate. I've watched what happened in her life and I've talked to many people and organizations around the world. What you do as politicians and legislators is wonderful, but you put up so many roadblocks.

I'm going to be frank, and I'm not saying this to anyone specifically; I'm saying this generally.

So many roadblocks get put up by one political party versus another political party. I have sat on six standing committees since 2012, on technology-facilitated violence, on gender-based violence, on exploitation against children and young people, on other ones on intimate images, and now this one.

I could copy and paste facts that I talk about: more funding, more legislation, more education, more awareness. Standing committees then come out with a report. We see those reports, but we never know what happens at the end: Do these things really happen? Is there more funding in law enforcement for training officers and for their knowledge? Are there changes in legislation?

Right now we are looking at Bill C-63. I read the news and I look at the points of view. I have someone from the justice minister's office contacting me regularly, because I understand that second reading came up on Bill C-63 last Friday.

Then you go back to the comments, and all it amounts to is infighting and arguing. Will this bill be passed? Other parties say no, it shouldn't be passed.

We are harming Canadians, our children and our citizens when things don't get passed. If you look and do your research, you see that other countries have passed legislation similar to Bill C-63. Australia is already in its third or fourth revision of what they passed years ago. I was in Australia last year and I met the eSafety Commissioner. I met law enforcement. I was a keynote speaker at one of their major exploitation conferences. I felt sad because Canada was represented by two officers in Ontario. Canada was so far behind.

We are a first world country, and our Canadians deserve to be protected. We need to make sure that everyone works on the legislation and on details. It's not just about passing laws: There are different silos. There's the education. There are the kids. There's the community. We all need to get involved. It's not about putting someone in jail because of.... It's about finding solutions that work. As a country, we are not finding those solutions that work right now. We aren't going to find every other predator in the world. Globally today, 750,000 predators are online looking for our children.

In my case, Amanda's predator came from the Netherlands. It's not just about one country, because the Internet is invisible fibres. We know that Nigeria has exploitation—

June 11th, 2024 / 6:25 p.m.

Bloc

Martin Champoux Bloc Drummond, QC

So there is a lot of awareness raising and education that we need to do as parents, but also as a society and as legislators.

Since we're talking about legislation, I can't help but mention Bill C-63, which was recently introduced and which I hope will be considered as quickly as possible.

Have you had a chance to look at it? If so, what are your impressions of this bill, which may be intended to help you do your job?

June 11th, 2024 / 6:20 p.m.

Liberal

Patricia Lattanzio Liberal Saint-Léonard—Saint-Michel, QC

My next question is addressed to Ms. Laidlaw.

Bill C-63 was developed to ensure compliance with all existing privacy laws and global best practices. Do you have any concerns related to the privacy implications of Bill S-210? Also, how do we ensure privacy is upheld in the development of online safety regulations?

June 11th, 2024 / 6:10 p.m.

Conservative

Rachael Thomas Conservative Lethbridge, AB

Thank you.

Thank you to all of the witnesses for your patience today.

My first question goes to Ms. Lalonde.

In an article that you recently wrote with regard to Bill C-63, you said it “contains...glaring gaps that risk leaving women and girls in Canada unfairly exposed.”

I'm wondering if you can expand on those gaps that you see within the legislation that would perhaps leave women and children vulnerable.

Criminal CodePrivate Members' Business

June 11th, 2024 / 6 p.m.

Conservative

Michelle Rempel Conservative Calgary Nose Hill, AB

Madam Speaker, I rise to debate this bill today, and I would like to focus my comments on a specific aspect of coercive control, for which there remains very few easy-to-access and easy-to-deploy de-escalation tools for victims. It is my hope that parliamentarians in the other place will consider the addition of these components to this bill, particularly as it pertains to specific tools to assist law enforcement officials in stopping coercive control from happening.

To set the context for this issue, I would like to refer to the Women's Legal Education & Access Fund, or LEAF. It developed a position paper on the criminalization of coercive control in response to this bill. In it, it defines “coercive control” as follows:

Coercive control is a concept used to describe a pattern of abusive behaviors in intimate partner relationships, based on tactics of intimidation, subordination, and control. This can include, among others, behaviors such as isolation, stalking, threats, surveillance, psychological abuse, online harassment, and sexual violence.

Other sources discussed threats of extortion, including so-called revenge porn, as one of the abusive behaviours also used to exert coercive control.

In its paper, LEAF raises the concern that the process of criminalizing coercive control may encounter significant challenges to legal success and that it may be “difficult to translate clearly into actionable criminal law.” One of the recommendations it makes to at least partially address this issue reads as follows: “Federal, provincial and territorial governments should take a proactive approach in focusing on the prevention of intimate partner violence.”

I would like to focus on two actionable, concrete ways to prevent two specific behaviours or components of coercive control: online harassment and revenge porn. In nearly nine years of power, the Liberal government has not taken material action to address the growing threat and breadth of online harassment, particularly as it relates to coercive control. The government's recently introduced and widely criticized Bill C-63, which many experts say would force Canadians to make trade-offs between their charter rights and their safety, does not adequately address the issue of women who are subject to a pattern of abusive behaviour online. Even if it did, today the minister admitted in the Toronto Star that the bill's provisions, which rely on the creation of an onerous new three-headed bureaucracy, would take years to functionally come into force.

Canadian women do not have time to wait for the minister's foot-dragging. Online harassment has been an issue for years, and the government has not ensured that our laws have kept pace with this issue. For evidence of this, I encourage colleagues to read the Canadian Resource Centre for Victims of Crime's guide to cyberstalking, which admits as much, saying that, when victims seek to report incidents of cyberstalking, “individual officers may be unfamiliar with the crimes or technology in question and may be uncertain about how to proceed.”

Indeed, last month, an article was released that was headlined, “RCMP boss calls for new politician anti-threats law”. It cited the need for more provisions to protect politicians from online harassment. I asked myself, if the RCMP cannot protect me, how are they going to protect anyone in my community from the same threat? We should all reflect upon this issue because, across Canada, at this very moment, women are receiving repeated, unwanted, harassing digital communications, and the best that many victim services groups can do to help, because of government inaction, is offer advice on how they can attempt to be less of a victim.

Women should not have to alter their behaviour. Potential harassers should be held to account, and their behaviour should be de-escalated before it escalates into physical violence. To do this, I encourage parliamentarians in the other place to consider the following in their review of this bill. They should ask the government to create a new criminal offence of online harassment that would update the existing crime of criminal harassment to address the ease and anonymity of online criminal harassment, which groups, in the deliberation of this bill, have noted as a component of coercive control.

Specifically, this new provision would apply to those who repeatedly send threatening or sexually explicit messages or content to people across the Internet and social media when they know, or should know, that it is not welcome. This could include aggravating factors for repeatedly sending such material anonymously and be accompanied by a so-called digital restraining order, which would allow victims of online criminal harassment to apply to a judge to identify the harasser and end the harassment. This would give police and victims clear and easy-to-understand tools to prevent online harassment and also prevent the escalation of this abuse to physical violence.

It would also allow for national awareness and education campaigns to be developed on what happens when someone criminally harasses somebody online. This would address a major issue of intimate partner violence and make it easier to materially and concretely stop coercive control. Members of the governing Liberal Party agreed to the need for these measures in a recent meeting of PROC related to the online harassment of elected officials.

In addition, the government must do more to address so-called revenge porn as a component of coercive control. An academic article entitled “Image-Based Sexual Abuse as a Means of Coercive Control: Victim-Survivor Experiences” states:

Victim-support advocates and domestic violence sector workers have increasingly acknowledged the role that image-based sexual abuse plays in the perpetuation of intimate partner abuse.... Image-based sexual abuse refers to the non-consensual taking or sharing of nude or sexual images (photos or videos), including making threats to share intimate images.... In the context of an intimate relationship, image-based sexual abuse can include any of the following acts: taking or sharing nude or sexual images without consent; threats to share intimate images to coerce a partner into sharing more intimate images or engage them in an unwanted act; and/or recording and or disseminating of sexual assault imagery.

However, colleagues, this has become even more of a concern given the advent of deepfake intimate images. I have been raising this issue in the House for over a year, and the government has still not moved to update the definition of “intimate images” in Canada's Criminal Code to specifically include deepfake intimate images. This component is not in Bill C-63.

This inaction is already harming women. A Winnipeg high school student had deepfaked intimate images of her circulated; no charges were filed, likely because of the gap in our law. As it relates to coercive control, can members imagine how easy it would be for an abuser to create so-called revenge porn to use against a victim using online technology? The government must act now, but if it will not, we parliamentarians must. Therefore, I ask members of the other place to consider the following in their review of this bill.

They should consider updating Canada's existing laws on the non-consensual distribution of intimate images to ensure that the distribution of intimate deepfakes is also criminalized via a simple definition update in the Criminal Code. This could be done easily and likely with all-party support in this place. It is shameful that the government has not moved to do that to date. In addition, the government admitted today in the Toronto Star that it is committed to dogmatically sticking with Bill C-63 as its only way to address online harms. This is despite widespread criticism and despite admitting that even the few supportable provisions in the bill would not come into force for years. Therefore, we in the opposition must look for ways to address these issues outside the government, particularly since online harm is a growing component of coercive control.

In addition to what I have already suggested, as parliamentarians, we should address the broader issue of online harms by doing things such as precisely specifying the duty of care required by online platforms. This should be done through legislation and not backroom regulation. The duty of care could include mechanisms to provide parents with the safeguards, controls and transparency to prevent harm to their kids when they are online; mechanisms to prevent and mitigate self-harm, mental health disorders, addictive behaviour, bullying and harassment, sexual violence and exploitation, and the promotion and marketing of products or services that are unlawful for minors; and mechanisms to implement privacy-preserving and trustworthy age verification methods, which many platforms have already built, to restrict access to any content that is inappropriate for minors while prohibiting the use of a digital ID in any of these mechanisms.

As well, we require mechanisms to give adults a clear and easy-to-use way to opt out of any default parental controls that a duty of care might provide for. Then, through legislation, we should ensure the appropriate enforcement of such measures through a system of administrative penalties and consequences by government agencies and bodies that already exist. In addition, the enforcement mechanisms could provide for the allowance of civil action when duties of care are violated in an injurious way.

To address coercive control, we need to address online harassment. I hope that colleagues in the other place will consider the suggestions I have made to do just that.

June 11th, 2024 / 5:20 p.m.

Founder and Mother, Amanda Todd Legacy Society

Carol Todd

Thank you.

The prevalence of sexually explicit material has increased due to the widespread use of the Internet. It manifests in various forms, including visual representations, photos, videos, films, written content, audio recordings and print material. The volume grows exponentially day by day. The protection that we have for our children and for our adults isn't there on the Internet. Big tech companies need to take responsibility. I know that throughout the world now, there are more and more lawsuits where big tech companies are being held responsible.

When accessing sexually explicit material, some of the challenges that we are faced with include access to violent and explicit content that can impact sexual attitudes and behaviours, the harm to children through the creation, sharing and viewing of sexual abuse material, increased violence against women and girls, as well as sex trafficking. It can also influence men's views on women and relationships.

In my notes, I comment that we often stereotype that it is men who are violating others, but the offenders can be men and they can be women. They can also be other children—peer-to-peer violence. There is no one set rule on who is creating and causing the harm, but we know that those who become traumatized and victimized can be anyone.

What more needs to be done? I'll just go through this quickly.

As an educator, I feel strongly that increasing education is crucial. The awareness and education need to go to our children, our young adults and our families.

We need stronger regulations and laws. Bill C-63 is one of them. I know that in the province of B.C., more legislation has been passed and is done.

We need to improve our online platforms and make them accountable. We need to increase parental controls and monitoring, and we need to encourage reporting.

We also need to promote positive online behaviours. Social emotional learning and social responsibility are part of the awareness and education that need to happen.

We need to be a voice. We need to stand up, and we also need to do more.

Thank you for the time, and I encourage questions so that I can finish reading my notes.

Thank you.

June 11th, 2024 / 5:15 p.m.

Carol Todd Founder and Mother, Amanda Todd Legacy Society

I'd like to thank the committee for inviting me to speak. It's an honour to be able to share knowledge.

I'm not coming as a researcher or someone who has studied this. I'm coming as a mom, and I'm coming as a parent and as an educator with lived experience, so confining my conversation to five minutes was difficult. I've written some notes that I will read until my time is up, and I do welcome questions at the end.

I have spent the last 12 years, I guess, looking at and learning about sexual exploitation and online behaviours, and it is really hard to imagine the horrid things that are happening out there to our children. As a side note, I believe that Bill C-63 needs to be passed, with some tweaks, because it is the safety net for our children and Canadians online.

This subject holds significant importance and warrants ongoing dialogue to tackle not just the ease of access to such material but also the profound harm that can be inflicted upon those who encounter sexually explicit content every day.

I am Carol Todd, widely known as Amanda Todd's mother. In addition, I am an educator in a British Columbia school district with my work primarily centred on digital literacy, online safety and child abuse prevention with a focus on exploitation and sextortion.

Empowering students, teachers and families with the knowledge and skills to navigate the digital world safely is essential, important and now a passion of mine. I will continue to talk forever about how we can keep families and children safe, because this is what we needed for my daughter, and it came a bit too late.

Amanda tragically took her life on October 10, 2012, following extensive online exploitation, tormenting harassment and cyber-abuse. Her story very much relates to what happens when there is creation, possession and distribution of sexually explicit material online and how easily others can access it as it becomes embedded online forever.

Amanda's story garnered global attention after her tragic death. To reclaim her voice while she was alive, Amanda created a video that she shared on YouTube five weeks before her passing. It has been viewed 50 million times worldwide and is now used as a learning tool for others to start the discussion and for students to learn more about what happened to her and why it's so important that we continue to talk about online safety, exploitation and sextortion.

As another side note, it has taken forever for us to catch up on the conversation about exploitation and sextortion. It was something that no one was able to talk about 12 years ago, in 2012. The conversation has evolved because of the increase in exploitation and sextortion online, happening not only to young girls, young boys and young adults but also to men and women. The nefarious offenders online have increased in numbers, because they have gotten away with it across so many layers of the Internet, and they have caused much trauma and much harm, as this is a form of abuse and violence.

Over the past decade, we've observed rapid changes in the technology landscape. Technology was primarily used as a communication tool, for email, and we have since seen the evolution of applications for fun. They were billed as safe, but now we know differently, because they have increased the chaos, concern and undesirable behaviours online for Canadians and for all.

This isn't just a Canadian problem. It's a global problem, and I have watched other countries create legislation, laws and safety commissions, just as Canada, with Bill C-63, now wants an e-safety commissioner board, and I think this is a brilliant idea. For anyone here who gets to vote, I hope that it does pass.

The prevalence of sexually explicit material has markedly increased—

June 11th, 2024 / 5:10 p.m.

Dr. Emily Laidlaw Associate Professor and Canada Research Chair in Cybersecurity Law, University of Calgary, As an Individual

Thank you for inviting me.

With my time, I'm going to focus on social media regulation and on Bills C-63 and S-210.

Social media has historically been lightly regulated. Online safety has only been addressed if companies felt like it or they were pressured by the market. There have been some innovative solutions, and we need them to continue to innovate, but safety has generally taken a back seat to other interests.

Social media companies have also privately set rules for freedom of expression, privacy and children's rights. There are no minimum standards and no ways to hold companies accountable. That is changing globally. Many jurisdictions have passed online harms legislation. The online harms act, which is part of Bill C-63, aligns with global approaches. In my view, with tweaks, Bill C-63 is the number one avenue to address illegal sexually explicit content and sexual exploitation.

Bill S-210 would mandate age verification to access sites with sexually explicit material. It is a flawed bill, yes, but more importantly, it is unnecessary for two reasons.

First, age verification is the crucial next frontier of online safety, but it is about more than sexually explicit material; it is about child safety broadly. The technology is evolving, and if we are committed to freedom of expression, privacy and cybersecurity, how this technology is used must be scrutinized closely.

Second, age verification is only one tool in the tool box. A holistic approach is needed whereby safety is considered in product design, content moderation systems and the algorithms. Let me give you a few examples of safety by design that does not involve age verification.

Child luring and sextortion rates are rising. What steps could social media take? Flag unusual friend requests from strangers and people in distant locations. Remove network expansion prompts whereby friends are recommended based on location and interest. Provide easy-to-use complaints mechanisms. Provide user empowerment tools, like blocking accounts.

The non-consensual disclosure of intimate images and child sexual abuse material requires immediate action. Does the social media service offer quick takedown mechanisms? Does it follow through with them? Does it flag synthetic media like deepfakes? How usable are the complaints mechanisms?

For example, Discord has been used to livestream child sexual exploitation content. The Australian e-safety commissioner reported that Discord does not enable in-service reporting of livestreamed abuse. This is an easy fix.

The last example is that the Canadian Centre for Child Protection offers a tool to industry, called Project Arachnid, to proactively detect child sexual abuse material. Should social media companies be using this to detect and remove content?

In my view, Bill C-63, again with tweaks, is the best avenue to address sexual exploitation generally. I think the focus should be on how to improve that bill. There are many reasons for that. I'll give two here.

First, the bill imposes three different types of responsibility. Vivek discussed this. Notably, the strongest obligation is the power of the commissioner to order the removal of child sexual abuse content and non-consensual disclosure of intimate images. This recognizes the need for the swift removal of the worst kinds of content.

Second, all of this would be overseen by a digital safety commission, ombudsperson and office. Courts are never going to be fast to resolve the kinds of disputes here, and they're costly. The power of the commissioner to order the removal of the worst forms of content is crucial to providing access to justice.

Courts are just ill-suited to oversee safety by design as well, which is necessarily an iterative process between the commission and companies. The tech evolves, and so do the harm and the solutions.

With my remaining time, I want to flag one challenge before I close, which Vivek mentioned as well. That is private messaging. Bill C-63 does not tackle private messaging. This is a logical decision; otherwise, it opens a can of worms.

Many of the harms explored here happen on private messaging. The key here is not to undermine privacy and cybersecurity protections. One way to bring private messaging into the bill and avoid undermining these protections is to impose safety obligations on the things that surround private messaging. I've mentioned many, such as complaints mechanisms, suspicious friend requests and so on.

Thank you for your time. I welcome questions.

June 11th, 2024 / 5:05 p.m.

Associate Professor of Law, University of Colorado Law School, As an Individual

Vivek Krishnamurthy

Very well.

The only thing I will say to conclude is that Bill C-63 does not deal with messaging software, with things like WhatsApp, which are a primary vector by which this kind of content moves. I think that is a good call, because of the difficulty in doing so. It's something that requires further study, a lot of work and a lot of thought on dealing with that particular piece of the distribution problem.

Thank you, Madam Chair.

June 11th, 2024 / 5 p.m.

Vivek Krishnamurthy Associate Professor of Law, University of Colorado Law School, As an Individual

Thank you, Madam Chair.

I'm very honoured to be here. I apologize in advance that I also have a hard deadline, due to child care obligations, so let me get right to it.

I'm not an expert on the harms caused by what the committee is studying, that is, exposure to illegal explicit sexual content. The focus of my remarks today will be on the technological means by which this kind of content is distributed and what can be done about it in compliance with the charter.

Just to frame my remarks, I think we can distinguish between two kinds of material. There's certain material that's per se illegal. Child sexual exploitation material is always illegal, but we face a challenge with material that's what I would call “conditionally illegal”. I think non-consensual distribution of intimate imagery falls into this category, because the illegality depends on whether the distribution is consensual or not—or the creation, for that matter.

The challenge we face is in regulating the distribution of this content by means of distribution that are general purpose. Take a social media platform, whichever one you want—Instagram, TikTok—or take a messaging platform such as WhatsApp. The problem with regulating the distribution of this content on those platforms is, of course, that we use them for many positive purposes, but they of course can be used for ill as well.

I'd like to pivot briefly to discuss the online harms act, which is, of course, before Parliament right now and which I think offers a good approach to dealing with one part of the distribution challenge with regard to social media platforms. These are platforms that take content generated by individuals and make it available to a large number of people. I think the framework of this law is quite sensible in that it creates “a duty to act responsibly”, which gets to the systemic problem of how platforms curate and moderate content. The idea here is to reduce the risk that this kind of content gets distributed on these platforms.

The bill is, in my view, well designed, in that there's also a duty to remove content, especially child sexual exploitation material and non-consensual distribution of intimate imagery, to the extent that platforms' own moderation efforts or user reports flag that content as being unlawful. This is a very sensible approach that I think is very compliant with the charter in its broad strokes.

The challenge, however, is with the effectiveness of these laws. It's very hard to determine before the fact how effective these are, because of issues with determining both the numerator and the denominator. I don't want to take us too much into mathematical territory, but it's very hard for us to measure the prevalence of this content online or on any given platform. It's just hard to identify, in part because the legality—or not—of the content depends on the conditions in which it's distributed. Then, on the numerator, which is how well the platforms are doing the job of getting it off, again, we have issues with identifying what's in and what's out. This is a step forward, but the bill has limitations.

One way of understanding the limitations is with an analogy that a friend of mine, Peter Swire, who teaches at Georgia Tech, calls the problem of “elephants and mice”. There are some elephants in the room, which are large, powerful and visible actors. These are your Metas and your TikToks, or even a company like Pornhub, which has a very large and significant presence. These are players that can't hide from the law, but what is difficult in this space is that there are many mice. Mice are small, they're furtive and they reproduce very quickly. They move around in darkness. This law is going to be very difficult to implement with regard to those kinds of actors, the ones that we find on the darker corners of the Internet.

Again, I think Bill C-63 is a very—

June 10th, 2024 / 1:25 p.m.

Chair, Canadian Muslim Lawyers Association

Husein Panju

We're familiar with Bill C-63, which is currently before the House. It's a complex issue. I think there needs to be some more dialogue with our groups on a more directed basis. You're right: Equity-seeking groups like ours are often the victims and the targets of hate speech, but there also needs to be some more consultation to ensure that any such measures do not overly censor legitimate, non-hateful speech from equity-seeking groups as well.

Online Harms ActGovernment Orders

June 7th, 2024 / 1:10 p.m.

Winnipeg North Manitoba

Liberal

Kevin Lamoureux LiberalParliamentary Secretary to the Leader of the Government in the House of Commons

Mr. Speaker, it is a pleasure to be able to rise and speak to Bill C-63.

We often talk about the communities and neighbourhoods in which we live. We do this not only as parliamentarians but also as politicians in general, whether at the municipal, provincial, or federal level. We talk about how we want people to feel safe. People need to feel safe in their homes, in their communities and in the places where they live. That has always been a priority for the current government and, I would like to think, for all parliamentarians of all political stripes. However, sometimes we need to look at finding a better definition of what we mean when we talk about keeping people safe in our communities.

The Internet is a wonderful thing, and it plays a critical and important role in society today. In fact, I would argue that, nowadays, it is an essential service that is virtually required in all communities. We see provincial and national governments investing greatly to ensure that there is more access to the Internet. We have become more and more dependent on it in so many different ways. It is, for all intents and purposes, a part of the community.

I could go back to the days when I was a child, and my parents would tell me to go outside and play. Yes, I would include my children as having been encouraged to go outside and play. Then things such as Nintendo came out, and people started gravitating toward the TV and playing computer games. I have grandchildren now, and I get the opportunity to see my two grandsons quite a bit. I can tell members that, when I do, I am totally amazed at what they are participating in on the Internet and with respect to technology. There are incredible programs associated with it, from gaming to YouTube, that I would suggest are a part of the community. Therefore, when we say that we want to protect our children in our communities when they are outside, we also need to protect them when they are inside.

It is easy for mega platforms to say it is not their responsibility but that of the parent or guardian. From my perspective, that is a cop-out. We have a responsibility here, and we need to recognize that responsibility. That is what Bill C-63 is all about.

Some people will talk about freedom of speech and so forth. I am all for freedom of speech. In fact, I just got an email from a constituent who is quite upset about how the profanity and flags displayed by a particular vehicle driving around are promoting all sorts of nastiness in the community. I indicated to them that freedom of speech entitles that individual to do that.

I care deeply about the fact that we, as a political party, brought in the Charter of Rights and Freedoms, which guarantees freedom of speech and expression. At the end of the day, I will always advocate for freedom of speech, but there are limitations. I believe that, if we look at Bill C-63, we can get a better sense of the types of limitations the government is talking about. Not only that, but I believe they are a reflection of a lot of the work that has been put together in order to bring the legislation before us today.

I understand some of the comments that have been brought forward, depending on which political parties addressed the bill so far. However, the minister himself has reinforced that this is not something that was done on a napkin; it is something that has taken a great deal of time, effort and resources to make sure that we got it right. The minister was very clear about the consultations that were done, the research that took a look at what has been done in other countries, and what is being said here in our communities. There are a great number of people who have been engaged in the legislation. I suspect that once it gets to committee we will continue to hear a wide spectrum of opinions and thoughts on it.

I do not believe that as legislators we should be put off to such a degree that we do not take action. I am inclined to agree with the minister in saying that this is a holistic approach to dealing with an important issue. We should not be looking at ways to divide the legislation. Rather, we should be looking at ways it can be improved. The minister himself, earlier today, said that if members have ideas or amendments they believe will give more strength to the legislation, then let us hear them. Bring them forward.

Often there is a great deal of debate on something at second reading and not as much at third reading. I suggest that the legislation before us might be the type of legislation that would be beneficial to pass relatively quickly out of second reading, after some members have had the opportunity to provide some thoughts, in favour of allowing more debate time at third reading and, more specifically, more time at the committee stage. That would allow members, for example, the opportunity to have discussions with constituents over the summer, knowing full well that the bill is at committee. I think there is a great deal of merit to that.

Something spoke volumes to me in terms of keeping the community safe and the impact the Internet has today on our children in particular. Platforms have a responsibility, and we have to ensure that they are living up to that responsibility.

I want to speak about Carol Todd, the mother of Amanda Todd, to whom reference has been made already. Ultimately, I believe, she is one of the primary reasons why the legislation is so critically important. Amanda Michelle Todd was born November 27, 1996, and passed away October 10, 2012. Colleagues can do the math. She was a 15-year-old Canadian student and a victim of cyber-bullying who hanged herself at her home in Port Coquitlam, British Columbia. There is a great deal of information on the Internet about Amanda. I thank her mother, Carol, for having the courage to share the story of her daughter, because it is quite tragic.

I think there is a lot of blame that can be passed around, whether it is to the government, the private sector or society, including individuals. Carol Todd made reference to the thought that her daughter Amanda might still actually be alive if, in fact, Bill C-63 had been law at the time. She said, “As a mom, and having gone through the story that I've gone through with Amanda, this needs to be bipartisan. All parties in the House of Commons need to look in their hearts and look at young Canadians. Our job is to protect them. And parents, we can't do it alone. The government has to step in and that's what we are calling for.”

That is a personal appeal, and it is not that often I will bring up a personal appeal of this nature. I thought it was warranted because I believe it really amplifies and humanizes why this legislation is so important. Some members, as we have seen in the debate already, have indicated that they disagree with certain aspects of the legislation, and that is fine. I can appreciate that there will be diverse opinions on this legislation. However, let us not use that as a way to ultimately prevent the legislation from moving forward.

Years of consultation and work have been put into the legislation to get it to where it is today. I would suggest that, given we have all had discussions related to these types of issues, whether through private members' bills or with constituents, we understand the importance of freedom of speech. We know why we have the Charter of Rights. We understand the basics of hate crime, and we all, I believe, acknowledge that freedom of speech does have some limitations to it.

I would like to talk about some of the things we should think about, in terms of responsibilities, when we think about platforms. I want to focus on platforms in my last three minutes. Platforms have a responsibility to be responsible. It is not all about profit. There is a societal responsibility that platforms have, and if they are not prepared to take it upon themselves to be responsible, then the government does need to take more actions.

Platforms need to understand and appreciate that there are certain aspects of society, and here we are talking about children, that need to be protected. Platforms cannot pass the buck on to parents and guardians. Yes, parents and guardians have the primary responsibility, but the Internet never shuts down. Even parents and guardians have limitations. Platforms need to recognize that they also have a responsibility to protect children.

Sexually victimized children, and intimate content that is shared without consent, are the types of things platforms have to do due diligence on. When the issue is raised with platforms, there is a moral and, with the passage of this legislation, a legal obligation for them to take action. I am surprised it has taken this type of legislation to hit that point home. At the end of the day, whether a life is lost, people are bullied, or depression and mental health issues are caused by things of that nature, platforms have to take responsibility.

There are other aspects that we need to be very much aware of. Inciting violent extremism or terrorism needs to be flagged. Content that induces a child to harm themselves also needs to be flagged. As it has been pointed out, this legislation would have a real, positive, profound impact, and it would not have to take away one's freedom of speech. It does not apply to private conversations or communications.

I will leave it at that and continue at a later date.

Online Harms Act
Government Orders

June 7th, 2024 / 1:05 p.m.

Bloc

Martin Champoux Bloc Drummond, QC

Mr. Speaker, I know that my colleague from New Westminster—Burnaby also cares about regulating what happens on the web. We had the opportunity to work together at the Standing Committee on Canadian Heritage on various topics that have to do with this issue.

We have been waiting for Bill C‑63 for a long time. I think that there is consensus on part 1. As the Bloc Québécois has been saying all day, it is proposing that we split the bill in order to quickly pass part 1, which is one part we all agree on.

The trouble is with part 2 and the subsequent parts. There are a lot of things that deserve to be discussed. There is one in particular that raises a major red flag, as far as I am concerned. It is the idea that a person could file a complaint because they fear that at some point, someone might utter hate speech or commit a crime as described in the clauses of the bill. A complaint could be filed simply on the presumption that a person might commit this type of crime.

To me, that seems to promote a sort of climate of accusation that could lead to paranoia. It makes me think of the movie Minority Report. I am sure my colleague has heard of it. I would like his impressions of this type of thing that we find in Bill C‑63.

Online Harms Act
Government Orders

June 7th, 2024 / 12:45 p.m.

NDP

Peter Julian NDP New Westminster—Burnaby, BC

Mr. Speaker, first of all, as we mentioned earlier, the NDP believes that certain aspects of Bill C‑63 are important and will help address a situation that calls for measures to counter online harm. However, other elements of this bill are not as clear and raise important questions.

We feel it is really necessary to pass the bill, send it to committee and give that committee the opportunity to do a thorough review. Parts of this bill are well done, but other parts need clarification and still others raise concerns. We therefore have some reservations.

This bill has been needed for years. The Liberal government promised it within 100 days of the last election, but it took almost three years, as members know. Finally, it has been introduced and is being examined. As parliamentarians, we need to do the work necessary to get answers to the questions people are asking, improve the parts of the bill that need improving and pass those parts that are sorely needed.

If parts of the bill cannot be passed or seem not to be in the public interest after a thorough examination in committee, it is our responsibility to withdraw them. However, there is no question that we need this legislation.

The harm being done to children is definitely rising. The idea that people can approach children, without restriction, to encourage them to self-harm or commit suicide should be something that our society will not tolerate. The fact that we have these web giants or platforms that promote child pornography is unacceptable. It should not be happening in our society. We have to acknowledge the importance of implementing laws to prevent this from happening. Hate speech is another issue. We are seeing a disturbing rise in violence in society, which is often fomented online.

For all of these reasons, we are going to pass this bill at second reading. We are going to send it to committee. This part of the process is very important to us. All answers must be obtained and all necessary improvements to the bill must be made in committee.

I do not think that anyone in the Parliament of Canada would like to vote against the principle of having such legislation in place. In practice, the important role of parliamentarians is to do everything in their power to produce a bill that achieves consensus, with questions answered and the necessary improvements put in place.

There is no doubt about the need for the bill. The NDP has been calling for the bill for years. The government promised it after 100 days. Canadians had to wait over 800 days before we saw the bill actually being presented.

In the meantime, the reality is that we have seen more and more cases of children being induced to harm themselves. This is profoundly disturbing to us, as parents, parliamentarians and Canadians, to see how predators have been going after children in our society. When we are talking about child pornography or inducing children to harm themselves, it is something that should be a profound concern to all of us.

Issues around the sharing of intimate content online without permission, in a way that attacks victims, are also something we have been calling for action on. It is important for parliamentarians to take action.

We have seen a steady and disturbing rise in hate crimes. We have seen it in all aspects of racism and misogyny, homophobia and transphobia, anti-Semitism and Islamophobia. All of these toxic sources of hate are rising.

I would note two things. First, the rise in anti-Semitism is mirrored by the rise in Islamophobia. Something we have seen from the far right is that they are attacking all groups.

Second, as the ADL has pointed out, in 2022 and 2023, all the violent acts of mass murder that were ideologically motivated came from the far right in North America. These are profoundly disturbing acts. We have a responsibility to take action.

The fact that the government has delayed the bill for so long is something we are very critical of. The fact that it is before us now means that, as parliamentarians, we have the responsibility to take both the sections of the bill where there is consensus and the parts where legitimate questions and concerns are being raised, and to ensure that the committee has all the resources necessary once the bill is referred to it in principle.

That second reading vote is a vote in principle, supporting the idea of legislation in this area. However, it is at the committee stage that we will see all the witnesses who need to come forward to dissect the bill and make sure that it is the best possible legislation. From there, we determine which parts of the bill can be improved, which parts are adequate and which parts, if they raise legitimate concerns and simply do not do the job, need to be taken out.

Over the course of the next few minutes, let us go through where there is consensus and where there are legitimate questions being raised. I want to flag that the issue of resources, which has been raised by every speaker so far today, is something that the NDP takes very seriously as well.

In the Conservative government that preceded the current Liberal government, we saw the slashing of crime prevention funding. This basically meant the elimination of resources that play a valuable role in preventing crimes. In the current Liberal government, we have not seen the resources that need to go into countering online harms.

There are legitimate questions being raised about whether resources are going to be adequate for the bill to do the job that it needs to do. Those questions absolutely need to be answered in committee. If the resources are not adequate, the best bill in the world is not going to do the job to stop online harms. Therefore, the issue of resources is key for the NDP as we move forward.

With previous pieces of legislation, we have seen that the intent was good but that the resources were inadequate. The NDP, as the adults in the House, the worker bees of Parliament, as many people have attested, would then push the Liberal government hard to actually ensure adequate resources to meet the needs of the legislation.

Legislation should never be symbolic. It should accomplish a goal. If we are concerned about online harms, and so many Canadians are, then we need to ensure that the resources are adequate to do the job.

Part 1 of the bill responds to the long-delayed need to combat online harms, and a number of speakers have indicated a consensus on this approach. It is important to note the definitions, which we certainly support, in the intent of part 1 of the bill, which is also integrated into other parts of the bill. The definitions include raising concerns about “content that foments hatred”, “content that incites violence”, “content that incites violent extremism or terrorism”, “content that induces a child to harm themselves”, “content that sexually victimizes a child or revictimizes a survivor”, “content used to bully a child” and “intimate content communicated without consent”.

All of these are, I think it is fair to say, definitions that are detailed in how they address each of those categories. This is, I think, a goal all parliamentarians would share. No one wants to see the continued increase in sexual victimization of children and content that induces a child to harm themselves.

I have raised before in the House the sad and tragic story of Molly Russell. I met with her father and have spoken with the family. The tragic result of her having content forced upon her that led to her ending her own life is a tragedy that we have seen repeated many times, where the wild west of online platforms is promoting, often through secret algorithms, material that is profoundly damaging to children. This is something that is simply unacceptable in any society, yet that content proliferates online. It is often reinforced by secret algorithms.

I would suggest that, while the definitions in the bill are strong concerning the content we do not want to see, whether it is violent extremism or the victimization of children, the reality is that it is not tackling a key element of why this harmful online content expands so rapidly, and with such disturbing strength, and that is the secretive algorithms online platforms use. There is no obligation for these companies to come clean about their algorithms, yet these algorithms inflict profound damage on Canadians, victimize children and, often, encourage violence.

One of the pieces I believe needs to be addressed through the committee process of the bill is why these online platforms have no obligation at all to reveal the algorithms that produce, in such disturbing strength, this profoundly toxic content. The fact is that a child, Molly Russell, was, through the algorithms, constantly fed material that encouraged her to ultimately end her own life, and these companies, these massive corporations, are often making unbelievable profits.

I will flag one more time that Canada continues to indirectly subsidize both Meta and Google, to the tune of a billion dollars a year, while these online platforms bear no responsibility at all, which is something I find extremely disturbing. These are massive amounts of money, and they are matched by massive profits. We absolutely need to get a handle on these significant subsidies. We see the fact that these algorithms are present, and not dealt with in the legislation, as a major problem.

Second, when we look at other aspects of the bill, the level of detail I have just run through in part 1's definitions is not mirrored in part 2 of the bill, which contains the Criminal Code amendments. The Criminal Code provisions have raised concerns because of their lack of definition. The concerns around part 2 firmly need to be dealt with at the committee stage. Answers need to be obtained, and amendments need to be brought to that section. I understand that as part of the committee process there will be rigorous questions asked on part 2. It is a concern that a number of people and organizations have raised. The committee stage is going to be crucial to improving, and potentially deleting, parts of the bill, subject to the rigorous questioning that would occur there.

The third part of the bill addresses issues around the Canadian Human Rights Commission. We were opposed to the former Harper government's gutting of the ability of the Human Rights Commission to uphold the Charter of Rights and Freedoms. Under the Charter of Rights and Freedoms, the Constitution that governs our country, Canadians have a right to be free from discrimination. The reality of the Harper government's cuts to that portion of the Canadian Human Rights Commission is something that we found disturbing at the time. The reality is that part 3, the question of resources and whether the Canadian Human Rights Commission has the ability to actually respond to the responsibilities that would come from part 3 of the bill, is something that we want to rigorously question witnesses on. Whether we are talking about government witnesses or the Canadian Human Rights Commission, it is absolutely important that we get those answers before we think of the next steps for part 3.

Finally, there is part 4, an act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service. That section of the bill as well is something that, I think it is fair to say, should receive some level of consensus from parliamentarians.

In short, at second reading, as members well know, the intent of the debate and discussion is whether or not we are in agreement with the principle of the bill. New Democrats are in agreement with the principle of the bill, though we have broad concerns about certain parts of it. We support in principle the intent of part 1: the idea that we would tackle, and force a greater level of responsibility on, the web giants that have profited for so long with such a degree of irresponsibility. That means tackling content that incites violence or violent extremism, content that induces a child to harm themselves or that sexually victimizes a child, content used to bully a child, and intimate content communicated without consent.

We look forward to a very rigorous examination at committee with the witnesses we need to bring forward. There is no doubt that there is a need for this bill and we need to proceed as quickly as possible, but only by hearing from the appropriate witnesses and making sure that we have gotten all the answers and made all the improvements necessary to this bill.

Online Harms Act
Government Orders

June 7th, 2024 / 12:25 p.m.

Bloc

Andréanne Larouche Bloc Shefford, QC

Mr. Speaker, it is not easy to speak in front of the member for Salaberry—Suroît, who does outstanding work and who just gave a wonderful speech. I will see what I can add to it. I may get a little more technical than she did. She spoke from the heart, as usual, and I commend her for that. I also want to thank her for her shout-out to Bill C-319. People are still talking to me about Bill C‑319, because seniors between the ages of 65 and 74 feel forgotten. We will continue this debate over the summer. In anticipation of this bill's eventual return before the House, we will continue to try to raise public awareness of the important issue of increasing old age security by 10% for all seniors.

I have gotten a bit off today's topic. I am the critic for seniors, but I am also the critic for status of women, and it is more in that capacity that I am rising today to speak to Bill C-63. This is an issue that I hear a lot about. Many groups reach out to me about hate speech. They are saying that women are disproportionately affected. That was the theme that my colleague from Drummond and I chose on March 8 of last year. We are calling for better control over hate speech out of respect for women who are the victims of serious violence online. It is important that we have a bill on this subject. It took a while, but I will come back to that.

Today we are discussing the famous Bill C‑63, the online harms act, “whose purpose is to, among other things, promote the online safety of persons in Canada, reduce harms caused to persons in Canada as a result of harmful content online and ensure that the operators of social media services in respect of which that Act applies are transparent and accountable with respect to their duties under that Act”. This bill was introduced by the Minister of Justice. I will provide a bit of context. I will then talk a bit more about the bill. I will close with a few of the Bloc Québécois's proposals.

To begin, I would like to say that Bill C‑63 should have been introduced much sooner. The Liberals promised to legislate against online hate. As members know, in June 2021, during the second session of the 43rd Parliament, the Liberals tabled Bill C-36, which was a first draft that laid out their intentions. This bill faced criticism, so they chose to let it die on the Order Paper. In July 2021, the government launched consultations on a new regulatory framework for online safety. It then set up an expert advisory group to help it draft a new bill. We saw that things were dragging on, so in 2022 we again asked about bringing back the bill. We wanted the government to keep its promises. This bill comes at a time when tensions are high and discourse is strained, particularly because of the war between Israel and Hamas. Some activists fear that hate speech will be used to silence critics. The Minister of Justice defended himself by saying that the highest level of proof would have to be produced before a conviction could be handed down.

Second, I would like to go back over a few aspects of the bill. Under this bill, operators who refuse to comply with the law, or with a decision of the commission, could face fines of up to 8% of their gross global revenue or $25 million, whichever is greater, depending on the nature of the offence. Bill C‑63 increases the maximum penalties for hate crimes. It even includes a definition of hatred as the "emotion that involves detestation or vilification and that is stronger than disdain or dislike". The bill addresses that. This legislation includes tough new provisions stipulating that a person who commits a hate-motivated crime, under any federal law, can be sentenced to life in prison. Even more surprising, people can file a complaint before a provincial court judge if they have reasonable grounds to suspect that someone is going to commit one of these offences.

Bill C-63 amends the Canadian Human Rights Act to allow the Canadian Human Rights Commission to receive complaints regarding the communication of hate speech. Individuals found guilty could be subject to an order. Private conversations are excluded from the communication of hate speech. There are all kinds of things like that to examine more closely. As my colleague explained, this bill contains several parts, each with its own elements. Certain aspects will need a closer look in committee.

Bill C-63 also updates the definition of “Internet service”. The law requires Internet service providers to “notify the law enforcement body designated by the regulations...as soon as feasible and in accordance with the regulations” if they have “reasonable grounds to believe that their Internet service is being or has been used to commit a child pornography offence”.

Bill C-63 tackles two major scourges of the digital world, which I have already discussed. The first is non-consensual pornographic material or child pornography, and the second is hate speech.

The provisions to combat child pornography and the distribution of non-consensual pornographic material are generally positive. The Bloc Québécois supports them. That is why the Bloc Québécois supports part 1 of the bill.

On the other hand, some provisions of Bill C‑63 to fight against hate are problematic. The Bloc Québécois fears, as my colleague from Salaberry—Suroît explained, that the provisions of Bill C‑63 might unnecessarily restrict freedom of expression. We want to remind the House that Quebec already debated the subject in 2015. Bill 59, which sought to counter radicalization, was intended to sanction hate speech. Ultimately, Quebec legislators concluded that giving powers to the Commission des droits de la personne et des droits de la jeunesse, as Bill C‑63 would have us do with the Canadian Human Rights Commission, would do more harm than good. The Bloc Québécois is going with the consensus in Quebec on this. It believes that the Criminal Code provisions are more than sufficient to fight against hate speech. Yes, the Bloc Québécois is representing the consensus in Quebec and reiterating it here in the House.

Third, the Bloc Québécois is proposing that Bill C‑63 be divided so that we can debate part 1 separately, as I explained. This is a critical issue. Internet pornography has a disproportionate effect on children, minors and women, and we need to protect them. This part targets sexual content. Online platforms are also targeted in the other parts.

We believe that the digital safety commission must be established as quickly as possible to provide support and recourse for those who are trying to have content about them removed from platforms. We have to help them. By dividing Bill C‑63, we would be able to debate and reach a consensus on part 1 more quickly.

Parts 2, 3 and 4 also contain provisions about hate speech. That is a bit more complex. Part 1 of the bill is well structured. It forces social media operators, including platforms that distribute pornographic material, such as Pornhub, to take measures to increase the security of digital environments. In order to do so, the bill requires social media operators to act responsibly. All of that is very positive.

Part 1 also talks about allowing users to report harmful content to operators based on seven categories defined by the law, so that it can be removed. We want Bill C-63 to be tougher on harmful content, meaning content that sexually victimizes a child or revictimizes a survivor and intimate content communicated without consent. As we have already seen, such content has serious consequences for victims, including post-traumatic stress. We need to take action.

However, part 2 of the bill is more problematic, because it amends the Criminal Code to increase the maximum sentences for hate crimes. The Bloc Québécois finds it hard to see how increasing maximum sentences for this type of crime will have any effect and how it is justified. Introducing a provision that allows life imprisonment for any hate-motivated federal offence is puzzling.

Furthermore, part 2 provides that a complaint can be made against someone when there is a fear they may commit a hate crime, and orders can be made against that person. However, as explained earlier, there are already sections of the Criminal Code that deal with these situations. This part is therefore problematic.

Part 3 allows an individual to file a complaint with the Canadian Human Rights Commission for speech that foments hate, including online speech. As mentioned, the Bloc Québécois has concerns that these provisions may be used to silence ideological opponents.

Part 4 states that Internet service providers must notify the appropriate authority if they suspect that their services are being used for child pornography purposes. In short, this part should also be studied.

In conclusion, the numbers are alarming. According to Statistics Canada, violent hate crimes have increased each year since 2015. Between 2015 and 2021, the total number of victims of violent hate crimes increased by 158%. The Internet is contributing to the surge in hate. However, if we want to take serious action, I think it is important to split Bill C‑63. The Bloc Québécois has been calling for this for a long time. Part 1 is important, but parts 2, 3 and 4 need to be studied separately in committee.

I would like to acknowledge all the work accomplished on this issue by my colleagues. Specifically, I am referring to the member for Drummond, the member for Rivière-du-Nord and the member for Avignon—La Mitis—Matane—Matapédia. We really must take action.

This is an important issue that the Bloc Québécois has been working on for a very long time.

Online Harms Act – Government Orders

June 7th, 2024 / 12:15 p.m.

Bloc

Claude DeBellefeuille Bloc Salaberry—Suroît, QC

Mr. Speaker, I have been authorized to share my time with the hon. member for Shefford, who does essential work for the Bloc Québécois on issues having to do with seniors. I would like to take this opportunity to remind the government that Bill C‑319, which was introduced by my colleague, was unanimously adopted in committee with good reason. The Bloc Québécois is proposing to increase the amount of the full pension by 10% starting at age 65 and change the way the guaranteed income supplement is calculated to benefit seniors.

There is a lot of talk about that in my riding. This bill is coming back to the House and the government should make a commitment at some point. We are asking the government to give royal assent to Bill C‑319. In other words, if the bill is blocked again, seniors will understand that the Liberals are once again abandoning them. I am passionate about the cause of seniors, and so I wanted to use my speech on Bill C‑63 to make a heartfelt plea on behalf of seniors in Quebec and to commend my colleague from Shefford for her work.

Today we are debating Bill C‑63, which amends a number of laws to tackle two major digital scourges, specifically child pornography, including online child pornography, and hate speech. This legislation was eagerly awaited. We were surprised that it took the government so long to introduce it.

We have been waiting a long time for this bill, especially part 1. The Bloc Québécois has been waiting a long time for such a bill to protect our children and people who are abused and bullied and whose reputations are jeopardized because of all the issues related to pornography. We agree with part 1 of the bill. We even made an offer to the minister. We agree with it so completely, and I believe there is a consensus about that across the House, that I think we should split the bill and pass the first part before the House rises. That way, we could implement everything needed to protect our children, teens and young adults who are currently going through difficult experiences that can change their lives and have a significant negative impact on them.

We agree that parts 2, 3 and 4 need to be discussed and debated, because the whole hate speech component of the bill is important. We agree with the minister on that. It is very important. What is currently happening on the Internet and online is unacceptable. We need to take action, but reaching an agreement on how to deal with this issue is not that easy. We need time and we need to debate it amongst ourselves.

The Bloc Québécois has a list of witnesses who could enlighten us on how we can improve the situation. We would like to hear from experts who could help us pass the best bill possible in order to protect the public, citizens and groups when it comes to the whole issue of hate speech. We also wonder why the minister, in part 2 of his bill, which deals with hate speech, omitted to include the two clauses of the bill introduced by the member for Beloeil—Chambly. I am talking about Bill C-367, which proposed removing the protection afforded under the Criminal Code to people who engage in hate speech on a religious basis.

We are wondering why the minister did not take the opportunity to add these clauses to his bill. These are questions that we have because to us, offering this protection is out of the question. It is out of the question to let someone use religion as an excuse to make gestures, accusations or even very threatening comments on the Internet under these sections of the Criminal Code. We are asking the minister to listen. The debates in the House and in committee are very polarized right now.

It would be extremely sad and very disappointing if we passed this bill so quickly that there was no time to debate it in order to improve it and make it the best bill it can be.

I can say that the Bloc Québécois is voting in favour of the bill at second reading. As I said, it is a complex bill. We made a proposal to the Prime Minister. We wrote to him and the leader. We also talked to the Minister of Justice to tell him to split the bill as soon as possible. That way, we could quickly protect the survivors who testified at the Standing Committee on Access to Information, Privacy and Ethics in the other Parliament. These people said that their life is unbearable, and they talked about the consequences they are suffering from being victims of sites such as Pornhub. They were used without their consent. Intimate images of them were posted without their consent. We are saying that we need to protect the people currently going through this by quickly adopting part 1. The committee could then study part 2 and hear witnesses.

I know that the member for Drummond and the member for Avignon—La Mitis—Matane—Matapédia raised this idea during committee of the whole on May 23. They tried to convince the minister, but he is still refusing to split the bill. We think that is a very bad idea. We want to repeat our offer. We do not really understand why he is so reluctant to do so. There is nothing partisan about what the Bloc Québécois is proposing. Our focus is on protecting victims on various platforms.

In closing, I know that the leaders are having discussions to finalize when the House will rise for the summer. Maybe fast-tracking a bill like this one could be part of the negotiations. However, I repeat that we are appealing to the Minister of Justice's sense of responsibility. I know he cares a lot about victims and their cause. We are sincerely asking him to postpone the passage of parts 2, 3 and 4, so that we can have more time to debate them in committee. Most importantly, we want to pass part 1 before the House rises for the summer so that we can protect people who are going through a really hard time right now because their private lives have been exposed online and they cannot get web platforms to take down their image, their photo or photos of their private parts.

We are appealing to the minister's sense of responsibility.

Online Harms Act – Government Orders

June 7th, 2024 / 10:45 a.m.

Conservative

Michelle Rempel Conservative Calgary Nose Hill, AB

Mr. Speaker, third, the government must actually enforce laws that are already on the books but have not been recently enforced due to an extreme lack of political will and disingenuous politics and leadership, particularly as they relate to hate speech. This is particularly in light of the rise of dangers currently faced by vulnerable Canadian religious communities such as, as the minister mentioned, Canada's Jewish community.

This could be done via actions such as ensuring the RCMP, including specialized integrated national security enforcement teams and national security enforcement sections, is providing resources and working directly with appropriate provincial and municipal police forces to share appropriate information and intelligence to provide protection to these communities, as well as making sure the security infrastructure program funding is accessible in an expedited manner so community institutions and centres can enhance security measures at their gathering places.

Fourth, for areas where modernization of existing regulations and the Criminal Code need immediate updating to reflect the digital age, and where there could be cross-partisan consensus, the government should undertake these changes in a manner that would allow for swift and non-partisan passage through Parliament.

These items could include some of the provisions discussed in Bill C-63. These include the duty of making content that sexually victimizes a child or revictimizes a survivor, or intimate content communicated without consent, inaccessible to persons in Canada in certain circumstances; imposing on online providers certain duties to keep all records related to sexual victimization; making provisions for persons in Canada to make a complaint to existing enforcement bodies, such as the CRTC or the police, not a new bureaucracy that would take years to potentially materialize and be costly and/or ineffective; ensuring that content on a social media service that sexually victimizes a child or revictimizes a survivor, or is intimate content communicated without consent, is made inaccessible to persons in Canada by authorization of a court making orders to the operators of those services; and enforcing the proposed amendment to An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service.

Other provisions the government has chosen not to include in Bill C-63, but that should have been and that Parliament should be considering in the context of harms that are being conducted online, must include updating Canada's existing laws on the non-consensual distribution of intimate images to ensure the distribution of intimate deepfakes is also criminalized, likely through a simple update to the Criminal Code. We could have done this by unanimous consent today had the government taken the initiative to do so. This is already a major problem in Canada: girls in high schools in Winnipeg are seeing intimate images of themselves and, as reports are saying, being sexually violated, without any ability for the law to intervene.

The government also needs to create a new criminal offence of online criminal harassment that would update the existing crime of criminal harassment to address the ease and anonymity of online criminal harassment. Specifically, this would apply to those who repeatedly send threatening and/or explicit messages or content to people across the Internet and social media when they know, or should know, it is not welcome. This could include aggravating factors for repeatedly sending such material anonymously and be accompanied by a so-called digital restraining order that would allow victims of online criminal harassment to apply to a judge, under strict circumstances, to identify the harasser and end the harassment.

This would protect privacy, remove the onus on social media platforms from guessing when they should be giving identity to the police and prevent the escalation of online harassment into physical violence. This would give police and victims clear and easy-to-understand tools to prevent online harassment and associated escalation. This would address a major issue of intimate partner violence and make it easier to stop coercive control.

As well, I will note to the minister that members of the governing Liberal Party agreed to the need for these exact measures at a recent meeting of PROC related to online harassment of elected officials this past week.

Fifth, the government should consider a more effective and better way to regulate online platforms, likely under the authority of the CRTC and the Minister of Industry, to better protect children online while protecting charter rights.

This path could include improved measures to do this. This could include, through legislation, not backroom regulation, but precisely through law, defining the duty of care required by online platforms. Some of these duties of care have already been mentioned in questions to the ministers today. This is what Parliament should be seized with, not allowing some unnamed future regulatory body to decide this for us while we have big tech companies and their lobbying arms defining that behind closed doors. That is our job, not theirs.

We could provide parents with safeguards, controls and transparency to prevent harm to their kids when they are online, which could be part of the duty of care. We could also require that online platforms put the interests of children first with appropriate safeguards, again, in a legislative duty of care.

There could also be measures to prevent and mitigate self-harm, mental health disorders, addictive behaviours, bullying and harassment, sexual violence and exploitation, and the promotion of marketing and products that are unlawful for minors. All of these things are instances of duty of care.

We could improve measures to implement privacy-preserving and trustworthy age verification methods, which many platforms already have the capacity to do, while prohibiting the use of a digital ID in any of these mechanisms.

This path could also include measures to ensure that the enforcement of these mechanisms, including a system of administrative penalties and consequences, is done through agencies that already exist. Additionally, we could ensure that there are perhaps other remedies, such as the ability to seek remedy for civil injury, when that duty of care is violated.

This is a non-comprehensive list of online harms, but the point is, we could come to consensus in this place on simple modernization issues that would update the laws now. I hope that the government will accept this plan.

I send a shout-out to Sean Phelan and David Murray, two strong and mighty workers. We did not have an army of bureaucrats, but we came up with this. I hope that Parliament considers this alternative plan, instead of Bill C-63, because the safety of Canadians is at risk.

Online Harms Act – Government Orders

June 7th, 2024 / 10:30 a.m.

Conservative

Michelle Rempel Conservative Calgary Nose Hill, AB

Mr. Speaker, we must protect Canadians in the digital age, but Bill C-63 is not the way to do it. It would force Canadians to make unnecessary trade-offs between the guarantee of their security and their charter rights. Today I will explain why Bill C-63 is deeply flawed and why it would not protect Canadians' rights sufficiently. More importantly, I will present a comprehensive alternative plan that is more respectful of Canadians' charter rights and would provide immediate protections for Canadians facing online harms.

The core problem with Bill C-63 is how the government has changed and chosen to frame the myriad harms that occur in the digital space as homogenous and as capable of being solved with one approach or piece of legislation. In reality, harms that occur online are an incredibly heterogenous set of problems requiring a multitude of tailored solutions. It may sound like the former might be more difficult to achieve than the latter, but this is not the case. It is relatively easy to inventory the multitudes of problems that occur online and cause Canadians harm. From there, it should be easy to sort out how existing laws and regulatory processes that exist for the physical world could be extended to the digital world.

There are few, if any, examples of harms that are being caused in digital spaces that do not already have existing relatable laws or regulatory structures that could be extended or modified to cover them. Conversely, what the government has done for nearly a decade is try to create new, catch-all regulatory, bureaucratic and extrajudicial processes that would adapt to the needs of actors in the digital space instead of requiring them to adapt to our existing laws. All of these attempts have failed to become law, which is likely going to be the fate of Bill C-63.

This is a backward way of looking at things. It has caused nearly a decade of inaction on much-needed modernization of existing systems and has translated into law enforcement's not having the tools it needs to prevent crime, which in turn causes harm to Canadians. It has also led to a balkanization of laws and regulations across Canadian jurisdictions, a loss of investment due to the uncertainty, and a lack of coordination with the international community. Again, ultimately, it all harms Canadians.

Bill C-63 takes the same approach by listing only a few of the harms that happen in online spaces and creates a new, onerous and opaque extrajudicial bureaucracy, while creating deep problems for Canadian charter rights. For example, Bill C-63 would create a new "offence motivated by hatred" provision that could see a life sentence applied to minor infractions under any act of Parliament, a parasitic provision that would be unchecked in the scope of the legislation. This means that words alone could lead to life imprisonment.

While the government has attempted to argue that this is not the case, saying that a serious underlying act would have to occur for the provision to apply, that is simply not how the bill is written. I ask colleagues to look at it. The bill seeks to amend section 320 of the Criminal Code, and reads, “Everyone who commits an offence under this Act or any other Act of Parliament...is guilty of an indictable offence and liable to imprisonment for life.”

At the justice committee earlier this year, the minister stated:

...the new hate crime offence captures any existing offence if it was hate-motivated. That can run the gamut from a hate-motivated theft all the way to a hate-motivated attempted murder. The sentencing range entrenched in Bill C-63 was designed to mirror the existing...options for all of these potential underlying offences, from the most minor to the most serious offences on the books....

The minister continued, saying, "this does not mean that minor offences will suddenly receive...harsh sentences. However, sentencing judges are required to follow legal principles", and "hate-motivated murder will result in a life sentence. A minor infraction will...not result in it."

In this statement, the minister admitted both that the new provision could be applied to any act of Parliament, as the bill states, and that the government would be relying upon the judiciary to ensure that maximum penalties were not levelled against a minor infraction. Parliament cannot afford for the government to be this lazy, and by that I mean not spelling out exactly what it intends a life sentence to apply to in law, as opposed to handing a highly imperfect judiciary an overbroad law that could have extreme, negative consequences.

Similarly, a massive amount of concern from across the political spectrum has been raised regarding Bill C-63's introduction of a so-called hate crime peace bond, calling it a pre-crime provision for speech. This is highly problematic because it would explicitly extend the power to issue peace bonds to crimes of speech, which the bill does not adequately define, nor does it provide any assurance that it would meet a criminal standard for hate.

Equally as concerning is that Bill C-63 would create a new process for individuals and groups to complain to the Canadian Human Rights Commission that online speech directed at them is discriminatory. This process would be extrajudicial, not subject to the same evidentiary standards of a criminal court, and could take years to resolve. Findings would be based on a mere balance of probabilities rather than on the criminal standard of proof beyond a reasonable doubt.

The subjectivity of defining hate speech would undoubtedly lead to punishments for protected speech. The mere threat of human rights complaints would chill large amounts of protected speech, and the system would undoubtedly be deluged with a landslide of vexatious complaints. There certainly are no provisions in the bill to prevent any of this from happening.

Nearly a decade ago, even the Toronto Star, hardly a bastion of Conservative thought, wrote a scathing opinion piece opposing these types of provisions. The same principle should apply today. The highly problematic components of the bill must be overlaid upon the fact that we are presently living under a government that unlawfully invoked the Emergencies Act and that routinely gaslights Canadians who legitimately question the efficacy or morality of its policies as spreading misinformation, as the Minister of Justice did in his response to my question, saying that I had mischaracterized the bill. Taken together, it is not a far leap to surmise that the new provision has great potential for abuse. That could be true for any political stripe that is in government.

The government's charter compliance statement, which is long and vague and has only recently been issued, should raise concerns for parliamentarians in this regard, as it relies on this statement: "The effects of the Bill on freedom of expression are outweighed by the benefits of protecting members of vulnerable groups". The government has already been found to have violated the Charter in the case of Bill C-69 for false presumptions about which benefits outweigh others. I suspect the same would be true for Bill C-63 should it become law, which I hope it does not.

I believe in the capacity of Canadians to express themselves within the bounds of protected speech and to maintain the rule of law within our vibrant pluralism. Regardless of political stripe, we must value freedom of speech and due process, because they are what prevents violent conflict. Speech already has clearly defined limitations under Canadian law. The provisions in Bill C-63 that I have just described are anathema to these principles. To be clear, Canadians should not be expected to have their right to protected speech chilled or limited in order to be safe online, which is what Bill C-63 would ask of them.

Bill C-63 would also create a new three-headed, yet-to-exist bureaucracy. It would leave much of the actual rules the bill describes to be created and enforced under undefined regulations by said bureaucracy at some much later date in the future. We cannot wait to take action in many circumstances. As one expert described it to me, it is like vaguely creating an outline and expecting bureaucrats, not elected legislators, to colour in the picture behind closed doors without any accountability to the Canadian public.

The government should have learned from the costs associated with failing when it attempted the same approach with Bill C-11 and Bill C-18, but alas, here we are. The new bureaucratic process would be slow, onerous and uncertain. If the government proceeds with it, it means Canadians would be left without protection, and innovators and investors would be left without the regulatory certainty needed to grow their businesses.

It would also be costly. I have asked the Parliamentary Budget Officer to conduct an analysis of the costs associated with the creation of the bureaucracy, and he has agreed to undertake the task. No parliamentarian should even consider supporting the bill without understanding the resources the government intends to allocate to the creation of the new digital safety commission, digital safety ombudsman and digital safety office, particularly since the findings in this week's damning NSICOP report starkly outlined the opportunity cost of the government failing to allocate much needed resources to the RCMP.

Said differently, if the government cannot fund and maintain the critical operations of the RCMP, which already has the mandate to enforce laws related to public safety, then Parliament should have grave, serious doubts about the efficacy of its setting up three new bureaucracies to address issues that could likely be managed by existing regulatory bodies like the CRTC or in the enforcement of the Criminal Code. Also, Canadians should have major qualms about creating new bureaucracies which would give power to well-funded and extremely powerful big tech companies to lobby and manipulate regulations to their benefit behind the scenes and outside the purview of Parliament.

This approach would not necessarily protect Canadians and may create artificial barriers to entry for new innovative industry players. The far better approach would be to adapt and extend long-existing laws and regulatory systems, properly resource their enforcement arms, and require big tech companies and other actors in the digital space to comply with these laws, not the other way around. This approach would provide Canadians with real protections, not what amounts to a new, ineffectual complaints department with a high negative opportunity cost to Canadians.

In no scenario should Parliament allow the government to entrench in legislation a power for social media companies to be arbiters of speech, which Bill C-63 risks doing. If the government wishes to further impose restrictions on Canadians' rights to speech, that should be a debate for Parliament to consider, not for regulators and tech giants to decide behind closed doors and with limited accountability to the public.

In short, this bill is completely flawed and should be abandoned, particularly given the minister's announcement this morning that he is unwilling to proceed with any sort of change to it in scope.

However, there is a better way. There is an alternative, which would be a more effective and more quickly implementable plan to protect Canadians' safety in the digital age. It would modernize existing laws and processes to align with digital advancements. It would protect speech not already limited in the Criminal Code, and would foster an environment for innovation and investment in digital technologies. It would propose adequately resourcing agencies with existing responsibilities for enforcing the law, not creating extrajudicial bureaucracies that would amount to a complaints department.

To begin, the RCMP and many law enforcement agencies across the country are under-resourced after certain flavours of politicians have given much more than a wink and a nod to the “defund the police” movement for over a decade. This trend must immediately be reversed. Well-resourced and well-respected law enforcement is critical to a free and just society.

Second, the government must also reform its watered-down bail policies, which allow repeat offenders to commit crimes over and over again. Criminals in the digital space will never face justice, no matter what laws are passed, if the Liberal government's catch-and-release policies are not reversed. I think of a woman in my city of Calgary who was murdered in broad daylight in front of an elementary school because her spouse was subject to the catch-and-release Liberal bail policy, in spite of his online harassment of her for a very long time.

Third, the government must actually enforce—

Online Harms Act – Government Orders

June 7th, 2024 / 10:20 a.m.

Bloc

Claude DeBellefeuille Bloc Salaberry—Suroît, QC

Mr. Speaker, the Bloc Québécois believes that Bill C-63 tackles two major online scourges and that it is time for us, as legislators, to take action to stamp them out.

The Bloc Québécois strongly supports part 1 of the bill, in other words, all provisions related to addressing child pornography and the communication of pornographic content without consent. As we see it, this part is self-evident. It has garnered such strong consensus that we told the minister, through our critic, the member for Rivière-du-Nord, that we not only support it, but we were also prepared to accept and pass part 1 quickly and facilitate its passage.

As for part 2, however, we have some reservations. We consider it reasonable to debate this part in committee. The minister can accuse other political parties of playing politics with part 2, but not the Bloc Québécois. We sincerely believe that part 2 needs to be debated. We have questions. We have doubts. I think our role calls on us to get to the bottom of things.

That is why we have asked the minister—and why we are asking him again today—to split Bill C‑63 in two, so that we can pass part 1 quickly and implement it, and set part 2 aside for legislative and debate-related purposes.

Online Harms Act – Government Orders

June 7th, 2024 / 10 a.m.

Parkdale—High Park Ontario

Liberal

Arif Virani, Liberal, Minister of Justice and Attorney General of Canada

moved that Bill C-63, An Act to enact the online harms act, to amend the Criminal Code, the Canadian Human Rights Act and An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service and to make consequential and related amendments to other Acts, be read the second time and referred to a committee.

Mr. Speaker, hon. colleagues, I am very pleased today to speak to Bill C-63, the online harms act. I speak today not only as a minister and as a fellow parliamentarian, but also as a father, as a South Asian and as a Muslim Canadian.

There are a few moments in this place when our work becomes very personal, and this is one such moment for me. Let me explain why. I ran for office for a number of reasons in 2015. Chief among them was to fight against discrimination and to fight for equality in what I viewed as an increasingly polarized world. In recent years, we have seen that polarization deepen and that hatred fester, including at home here in Canada.

I would never have fathomed that in 2024, Canada would actually lead the G7 in the number of deaths attributable to Islamophobia. Among our allies, it is Canada that has experienced the most fatal attacks against Muslims in the G7. There have been 11. Those were 11 preventable deaths. I say “preventable” because in the trials of both the Quebec mosque shooter, who murdered six men on January 29, 2017, and the man who murdered four members of the Afzaal family in London, Ontario, the attackers admitted, in open court, to having been radicalized online. They admitted what so many of us have always known to be the case: Online hatred has real-world consequences.

Yesterday was the third anniversary of the attack on the Afzaal family, an attack described by the presiding judge as “a terrorist act”. In memory of Talat, Salman, Yumna and Madiha, who lost their lives to an act of hatred on June 6, 2021, we are taking action.

Bill C-63, the online harms act, is a critical piece of that action. This bill is the product of years of work.

We held consultations for over four years. We talked to victims' groups, advocacy groups, international partners, people from the technology industry and the general public. We organized a nationwide consultation and held 19 national and regional round tables. We published a report about what we learned. We listened to the recommendations of our expert advisory group on online safety, a diverse think tank made up of experts who are respected across Canada. We were given valuable advice and gained a great deal of knowledge thanks to those consultations, and all of that informed the development of Bill C-63.

Many of our international partners, such as the United Kingdom, Australia, Germany, France and the European Union, have already done considerable legislative work to try to limit the risks of harmful content online. We learned from their experience and adapted the best parts of their most effective plans to the Canadian context.

We have also learned what did not work abroad, like the immediate takedown of all types of harmful content, originally done in Germany; or like the overbroad restriction on freedom of speech that was struck as unconstitutional in France. We are not repeating those errors here. Our approach is much more measured and reflects the critical importance of constitutionally protected free expression in Canada's democracy. What we learned from this extensive consultation was that the Internet and social media platforms can be a force for good in Canada and around the world. They have been a tool for activists to defend democracy. They are platforms for critical expression and for critical civic discourse. They make learning more accessible to everyone.

The Internet has made people across our vast world feel more connected to one another, but the Internet also has a dark side. Last December, the RCMP warned of an alarming spike in online extremism among young people in Canada and the radicalization of youth online. We know that the online environment is especially dangerous for our most vulnerable. A recent study by Plan International found that 58% of girls have experienced harassment online.

Social media platforms are used to exploit and disseminate devastating messages with tragic consequences. This is because of one simple truth. For too long, the profits of platforms have come before the safety of users. Self-regulation has failed to keep our kids safe. Stories of tragedy have become far too common. There are tragic consequences, like the death of Amanda Todd, a 15-year-old Port Coquitlam student who died by suicide on October 10, 2012, after being exploited and extorted by more than 20 social media accounts. This relentless harassment started when Amanda was just 12 years old, in grade 7.

There was Carson Cleland last fall. He was the same age as my son at the time: 12 years old. Carson made a mistake. He shared an intimate image with someone who he thought was a friend online, only to find himself caught up in a web of sextortion from which he could not extricate himself. Unable to turn to his parents, too ashamed to turn to his friends, Carson turned on himself. Carson is no longer with us, but he should be with us.

We need to do more to protect the Amanda Todds and the Carson Clelands of this country, and with this bill, we will. I met with the incredible people at the Canadian Centre for Child Protection earlier this year, and they told me that they receive 70 calls every single week from scared kids across Canada in situations like Amanda's and like Carson's.

As the father of two youngsters, this is very personal for me. As they grow up, my 10-year-old and 13-year-old boys spend more and more time on screens. I know that my wife and I are not alone in this parenting struggle. It is the same struggle that parents are facing around the country.

At this point, there is no turning back. Our children and teens are being exposed to literally everything online, and I feel a desperate need, Canadians feel a desperate need, to do a better job of protecting those kids online. That is precisely what we are going to do with this bill.

Bill C-63 is guided by four important objectives. First, it aims to reduce exposure to harmful content online and to empower and support users. Second, it would address and denounce the rise in hatred and hate crimes. Third, it would ensure that victims of hate have recourse to improved remedies. Fourth, it would strengthen the reporting of child sexual abuse material to enhance the criminal justice response to this heinous crime.

The online harms act will address seven types of harmful content based on categories established over more than four years of consultation.

Not all harms will be treated the same. Services will be required to quickly remove content that sexually victimizes a child or that revictimizes a survivor, as well as to remove what we call “revenge porn”, including sexual deepfakes. There is no place for this material on the Internet whatsoever.

For other types of content, like content that induces a child to self-harm or material that bullies a child, we are placing a duty on platforms to protect children. This means a new legislative and regulatory framework to ensure that social media platforms reduce exposure to harmful, exploitative content on their platforms. This means putting in place special protections for children. It also means that platforms will have to make sure that users have the tools and the resources they need to report harmful content.

To fulfill the duty to protect children, social media platforms will have to integrate age-appropriate design features to make their platforms safer for children to use. This could mean defaults for parental controls and warning labels for children. It could mean security settings for instant messaging for children, or it could mean safe-search settings.

Protecting our children is one of our most important duties that we undertake as lawmakers in this place. As a parent, it literally terrifies me that the most dangerous toys in my home, my children's screens, are not subject to any safety standards right now. This needs to change, and it would change with the passage of Bill C-63.

It is not only that children are subject to horrible sexual abuse and bullying online, but also that they are exposed to hate and hateful content, as are Internet users of all ages and all backgrounds, which is why Bill C-63 targets content that foments hatred and incitements to violence as well as incitements to terrorism. This bill would not require social media companies to take down this kind of harmful content; instead, the platforms would have to reduce exposure to it by creating a digital safety plan, disclosing to the digital safety commissioner what steps they are putting in place to reduce risk and reporting back on their progress.

The platforms would also be required to give users practical options for recourse, like tools to either flag or block certain harmful material from their own feeds. This is key to ensuring community safety, all the more so because these obligations are backed by significant penalties for non-compliance. When I say “significant”, the penalties would be 6% of global revenue or $10 million, whichever is higher, and in the instance of a contravention of an order from the digital safety commission, those would rise to 8% of global revenue or $25 million, again, whichever is higher.

The online harms act is an important step towards a safer, more inclusive online environment, where social media platforms actively work to reduce the risk of user exposure to harmful content on their platforms and help to prevent its spread, and where, as a result, everyone in Canada can feel safer to express themselves openly. This is critical because, at its heart, this initiative is about promoting expression and participation in the civic discourse that occurs online. We can think about Carla Beauvais and the sentiments she expressed when she stood right beside me as we tabled this legislation in February, and the amount of abuse she faced for voicing her concerns about the George Floyd incident in the United States, abuse that cowed her and prevented her from participating online. We want her voice added to the civic discourse. Right now, it has been removed.

The online harms act will regulate social media services, the primary purpose of which is to enable users to share publicly accessible content, services that pose the greatest risk of exposing the greatest number of people to harmful content.

This means that the act would apply to social media platforms, such as Facebook, X and Instagram; user-uploaded adult content services, such as Pornhub; and livestreaming services, such as Twitch. However, it would not apply to any private communications, meaning private texts or direct private messaging on social media apps, such as Instagram or Facebook Messenger. It is critical to underscore, again, that this is a measured approach that does not follow the overreach seen in other countries we have studied, in terms of how they embarked upon this endeavour. The goal is to target the largest social media platforms, the places where the most people in Canada are spending their time online.

Some ask why Bill C-63 addresses both online harms and hate crimes, which can happen both on and off-line. I will explain this. Online dangers do not remain online. We are seeing a dramatic rise in hate crime across our country. According to Statistics Canada, the number of police-reported hate crimes increased by 83% between 2019 and 2022. B'nai Brith Canada reports an alarming 109% increase in anti-Semitic incidents from 2022 to 2023. In the wake of October 7, 2023, I have been hearing frequently from Jewish and Muslim groups, which are openly questioning whether it is safe to be openly Jewish or Muslim in Canada right now. This is not tenable. It should never be tolerated, yet hate-motivated violence keeps happening. People in Canada are telling us to act. It is up to us, as lawmakers, to do exactly that.

We must take concrete action to better protect all people in Canada from harms, both online and in our communities. We need better tools to deal with harmful content online that foments violence and destruction. Bill C-63 gives law enforcement these much-needed tools.

The Toronto Police Service has expressed their open support of Bill C-63 because they know it will make our communities safer. Members of the Afzaal family have expressed their open support for Bill C-63 because they know the Islamophobic hate that causes someone to kill starts somewhere, and it is often online.

However, we know there is no single solution to the spread of hatred on and off-line. That is why the bill proposes a number of different tools to help stop the hate. It starts with the Criminal Code of Canada. Bill C-63 would amend the Criminal Code to better target hate crime and hate propaganda. It would do this in four important ways.

First, it would create a new hate crime offence. Law enforcement has asked us for this tool, so they can call a hate crime a hate crime when laying a charge, rather than as an afterthought at sentencing. This new offence will also help law enforcement track the actual number of hate-motivated crimes in Canada. That is why they have appealed to me to create a free-standing hate crime offence in a manner that replicates what already exists in 47 of the 50 states south of the border. A hate-motivated assault is not just an assault. It is a hate crime and should be recognized as such on the front end of a prosecution.

Second, Bill C‑63 would increase sentences for the four existing hate speech offences. These are serious offences, and the sentences should reflect that.

Third, Bill C-63 would create a recognizance to keep the peace, which is specifically designed to prevent any of the four hate propaganda offences and the new hate crime offence from being committed.

This would be modelled on existing peace bonds, such as those used in domestic violence cases, and would require someone to have a reasonable fear that these offences would be committed. The threshold of “reasonable fear” is common to almost all peace bonds.

In addition, as some but not all peace bonds do, this would require the relevant attorney general to give consent before an application is made to a judge to impose a peace bond on a person. This ensures an extra layer of scrutiny in the process.

Finally, the bill would codify a definition of hatred for hate propaganda offences and for the new hate crime offence, based on the definition the Supreme Court of Canada created in its seminal decisions in R. v. Keegstra and in Saskatchewan Human Rights Commission v. Whatcott. The definition sets out not only what hatred is but also what it is not, thereby helping Canadians and law enforcement to better understand the scope of these offences.

The court has defined hate speech as content that expresses detestation or vilification of an individual or group on the basis of grounds such as race, national or ethnic origin, religion and sex. It only captures the most extreme and marginal type of expression, leaving the entirety of political and other discourse almost untouched. That is where one will find the category of content that some have called “awful but lawful”. This is the stuff that is offensive and ugly but is still permitted as constitutionally protected free expression under charter section 2(b). This category of content is not hate speech under the Supreme Court's definition.

I want to make clear what Bill C‑63 does not do. It does not undermine freedom of expression. It strengthens freedom of expression by allowing all people to participate safely in online discussions.

Bill C-63 would provide another tool as well. It would amend the Canadian Human Rights Act to define a new discriminatory practice of communicating hate speech online. The legislation makes clear that hate does not encompass content that merely discredits, humiliates, hurts or offends, but where hate speech does occur, there would be a mechanism through which an individual could ask that those expressions of hate be removed. The CHRA amendments are not designed to punish anyone. They would simply give Canadians a tool to get hate speech removed.

Finally, Bill C-63 would modernize and close loopholes in the mandatory reporting act. By requiring that information be retained longer and ensuring that social media companies report CSAM to the RCMP, this would help law enforcement more effectively investigate child sex abuse and exploitation and bring perpetrators to justice.

There is broad support for the online harms act. When I introduced the legislation in February, I was proud to have at my side the Centre for Israel and Jewish Affairs and the National Council of Canadian Muslims. Those two groups have had vast differences in recent months, but on the need to fight hatred online, they are united. The same unity has been expressed by both Deborah Lyons, the special envoy on preserving Holocaust remembrance and combatting anti-Semitism, and Amira Elghawaby, the special representative on combatting Islamophobia.

The time to combat all forms of online hate is now. Hatred that festers online can result in real-world violence. I am always open to good-faith suggestions on how to improve the bill. I look forward to following along with the study of the legislation at the committee stage. I have a fundamental duty to uphold the charter protection of free expression and to protect all Canadians from harm. I take both duties very seriously.

Some have urged me to split Bill C-63 in two, dealing only with the provisions that stop sexually exploitative material from spreading and throwing away measures that combat hate. To these people, I say that I would not be doing my job as minister if I failed to address the rampant hatred on online platforms. It is my job to protect all Canadians from harm. That means kids and adults. People are pleading for relief from the spread of hate. It is time we acted.

Bill C-63 is a comprehensive response to online harms and the dangerous hate we are seeing spreading in our communities. We have a duty to protect our children in the real world. We must take decisive action to protect them online as well, where the dangers can be just as pernicious, if not more so. Such action starts with passing Bill C-63.

Business of the House / Oral Questions

June 6th, 2024 / 3:20 p.m.
Gatineau Québec

Liberal

Steven MacKinnon LiberalLeader of the Government in the House of Commons

Mr. Speaker, there is indeed a secret in the House, and that is the Conservative Party's true intentions when it comes to cuts. “Chop, chop, chop,” as my colleague from Gaspésie—Les Îles-de-la-Madeleine so aptly puts it. That party wants to cut social programs and the programs that are so dear to Quebeckers and Canadians: women's rights, the right to abortion, the right to contraception. The Conservatives want to scrap our government's dental care and pharmacare plans. The secret is the Conservative Party's hidden agenda, which will do great harm to all Canadians.

With our government's usual transparency, this evening we will proceed to report stage consideration of Bill C-20, an act establishing the public complaints and review commission and amending certain acts and statutory instruments, and Bill C-40, an act to amend the Criminal Code, to make consequential amendments to other acts and to repeal a regulation regarding miscarriage of justice reviews, also known as David and Joyce Milgaard's law.

Tomorrow, we will begin second reading of Bill C-63, an act to enact the online harms act, to amend the Criminal Code, the Canadian Human Rights Act and An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service and to make consequential and related amendments to other acts.

I would like to inform the House that next Monday and Thursday shall be allotted days. On Tuesday, we will start report stage of Bill C-69, the budget implementation act. On Wednesday, we will deal with Bill C-70, concerning foreign interference, as per the special order adopted last Thursday. I wish all members and the House staff a good weekend.

June 3rd, 2024 / 12:10 p.m.
As an Individual

Ali Islam

What I know about Bill C-63 I've heard through the media. I haven't read the bill myself.

June 3rd, 2024 / 12:10 p.m.
Liberal

Sameer Zuberi Liberal Pierrefonds—Dollard, QC

Thank you so much. The liaison idea is a great idea.

I'd like to move on to Mr. Ali Islam.

You referenced a really important piece of legislation, Bill C-63, which is the online harms act. I'm quite preoccupied with the online space. I've noticed on my own social media that when I post about certain issues, there's a lot of trolling. I don't know if it's from trolls in particular or bot farms, but I have a lot of trolling and a lot of it's hateful. I can't delete the middle fingers. I can't delete the hateful comments towards other identifiable groups.

I'm wondering if you think, from your knowledge, that Bill C-63, the online harms act, will help address the issue of misinformation, bots and hate that's being spewed online.

May 27th, 2024 / 6:55 p.m.
Conservative

Garnett Genuis Conservative Sherwood Park—Fort Saskatchewan, AB

Thank you, Chair.

Thank you to our witnesses.

We're obviously familiar with the Liberal government's position on this bill. With respect to the officials, of course you're expected to support that position. Your role as an official is not to come here and state your disagreement with government policy, even if you might privately disagree with it.

I will just say that I think that many of the arguments you put forward were clearly refuted by the senator already. I also want to say that I think Bill C-63 is a real disaster. It raises actual censorship issues. It has nothing on age verification. It's far, far broader than Bill S-210 at every level. It's enforced by vaguely empowered bureaucratic agencies and it includes dealing with speech.

Most Canadians who have seen what your government did.... To be fair, I understand your role as a non-partisan public servant, tasked with providing fearless advice and faithful implementation. However, what the Liberal government has put forward in Bill C-63 is not being well received across the board.

On the issues with section 171, I'm looking at the Criminal Code and trying to understand the argument here.

We have one definition of sexually explicit material in the Criminal Code. Implicitly, it's being suggested that maybe we could have multiple different definitions of sexually explicit material operating at the same time. However, it seems eminently logical that you would have one definition that relies on the existing jurisprudence.

Mr. Bittle has suggested that if this definition covers Game of Thrones, then it's already a problem, because it already violates the Criminal Code if, in the commission of another offence, you were to show a child that material. Therefore, you already could run afoul of the Criminal Code if you put on Game of Thrones in your home for your 16-year-old. That's not happening. No one's getting arrested and going to jail because they let their 16-year-old watch Game of Thrones. If that's not happening already off-line, then maybe that suggests that this extensive reinterpretation of what the existing law already says is a little bit exaggerated.

In this context, we also know that Pornhub has been represented by a well-connected Liberal lobbyist who has met with Liberals in the lead-up to the vote.

I want to ask the Privacy Commissioner about what he said in terms of potential amendments.

How would this apply on social media? I'm going to just pose the question. I have young children. I obviously don't want them accessing the major, well-known pornography websites. I also don't want them seeing pornographic material on any other website that they might go to for a legitimate purpose. Therefore, if my children are on social media—they're not—or if they were on another website, if they were watching a YouTube video on that, whatever it was, I would want to ensure that 6-, 7-, 8-, 9-, 10-, 11- and 12-year-olds were not accessing pornography, regardless of the platform and regardless of the percentage of that company's overall business model.

I don't really understand philosophically why it would make sense or protect anyone's privacy to have an exemption for sites where it's just a small part of what they do, because if the point is to protect children, then the point is to protect children wherever they are.

I'd be curious for your response to that.

May 27th, 2024 / 6:45 p.m.
Associate Assistant Deputy Minister, Cultural Affairs, Department of Canadian Heritage

Owen Ripley

Thank you for those questions.

In the sense that the purpose of Bill C‑63 is to promote online safety and reduce harm, the duty to protect children, which is referred to in section 64 of the proposed act, is quite flexible. According to the proposed section, “an operator has a duty, in respect of a regulated service that it operates, to protect children by complying with section 65.” Section 66 of the proposed act gives the commission the power to establish a series of duties or measures that must be incorporated into the service.

According to the government, the proposed act provides the flexibility needed to better protect children on social media. It is certainly legitimate to ask, as was done during the consultations, whether the appropriate response is to require some services to adopt age verification. Once again, there will be a specialized regulator with the necessary expertise. In addition, there are mechanisms to consult civil society and experts to ensure that these decisions are well thought out.

May 27th, 2024 / 6:45 p.m.
Bloc

Kristina Michaud Bloc Avignon—La Mitis—Matane—Matapédia, QC

You also have concerns about the protection of privacy and personal information.

Comparisons are often made with Bill C‑63, but in my opinion, the two are quite different. Bill C‑63 aims to protect children from harmful online content, which is commendable. Bill S‑210 seeks to limit access to pornography.

The regulator you want to create through Bill C‑63 seems as though it could be very effective in playing that kind of role. The digital safety commission could play the same role as commissions in other countries. The same goes for the age verification processes.

Can you tell us what concerns you have regarding privacy, as well as any other concerns?

May 27th, 2024 / 6:40 p.m.
Liberal

Chris Bittle Liberal St. Catharines, ON

Thank you so much.

I don't have much time, but perhaps I could turn to Mr. Ripley for him to expand on Bill C-63, the online harms act, with respect to what the government is intending to do to protect individuals from harms that are on the Internet.

May 27th, 2024 / 6:25 p.m.
Owen Ripley Associate Assistant Deputy Minister, Cultural Affairs, Department of Canadian Heritage

Mr. Chair, thank you for inviting me to discuss Bill S‑210. As the associate assistant deputy minister for cultural affairs at the Department of Canadian Heritage, I will be responsible for the Online Harms Act that is being proposed as part of Bill C‑63.

While Bill C‑63 was being drafted, the department heard directly from experts, survivors from civil society and members of the public on what should be done to combat the proliferation of harmful content online.

A common theme emerged from these consultations: the vulnerability of children online and the need to take proactive measures to protect them. With this in mind, the future online harms act proposes a duty to protect children, which will require platforms to incorporate age-appropriate design features for children. Bill C‑63 also proposes a specialized regulatory authority that will have the skills and expertise to develop regulations, guidance and codes of practice, in consultation with experts and civil society.

Bill S-210 seeks to achieve a similarly admirable goal of protecting children online. However, the bill is highly problematic for a number of reasons, including a scope that is much too broad in terms of regulated services, as well as regulated content; possible risk to Canadians' privacy, especially considering the current state of age-verification frameworks internationally; structural incoherence that seems to mix criminal elements with regulatory elements; a troubling dependence on website blocking as the primary enforcement mechanism; and a lack of clarity around implementation and an unrealistic implementation timeline.

I'll briefly unpack a few of these concerns in greater detail.

As drafted, Bill S-210 would capture a broad range of websites and services that make sexually explicit material available on the Internet for commercial purposes, including search engines, social media platforms, streaming and video-on-demand applications, and Internet service providers. Moreover, the bill's definition of sexually explicit material is not limited to pornography but instead extends to a broader range of mainstream entertainment content with nudity or sex scenes, including content that would be found on services like Netflix, Disney+, or CBC Gem. Mandating age-verification requirements for this scope of services and content would have far-reaching implications for how Canadians access and use the Internet.

While efforts are under way globally in other jurisdictions to develop and prescribe age-verification technologies, there is still a lack of consensus that they are sufficiently accurate and sufficiently privacy-respecting. For example, France and Australia remain concerned that the technology is not yet sufficiently mature, and the testing of various approaches is ongoing. Over the next couple of years, the U.K. will ultimately require age assurance for certain types of services under its Online Safety Act. Ofcom is currently consulting on the principles that should guide the rollout of these technologies. However, the requirement is not yet in force, and services do not yet have to deploy age assurance at scale. In jurisdictions that have already moved ahead, such as certain U.S. states or Germany, there continue to be questions about privacy, effectiveness and overall compliance.

In short, these international examples show that mandates regarding age verification or age assurance are still a work in progress. There is also no other jurisdiction proposing a framework comparable in scope to Bill S-210. Website blocking remains a highly contentious enforcement instrument that poses a range of challenges and could impact Canadians' freedom of speech and Canada's commitment to an open and free Internet and to net neutrality.

I want to state once again that the government remains committed to better protecting children online. However, the government feels that the answer is not to prescribe a specific technology that puts privacy at risk and violates our commitment to an open Internet. It is critical that any measures developed to achieve this goal create a framework for protecting children online that is both flexible and well-informed.

Thank you for your attention. I look forward to any questions you may have.

May 27th, 2024 / 5:30 p.m.
Bloc

Kristina Michaud Bloc Avignon—La Mitis—Matane—Matapédia, QC

I only have a few seconds left, but I want to hear your thoughts on the fact that, according to the government, we don't need Bill S-210, since there's Bill C-63. To my knowledge, they're not the same at all. Bill C‑63 is extremely important, to be sure, but it's not identical to Bill S‑210. Do you share that opinion?

May 27th, 2024 / 5:15 p.m.
Senator, Quebec, ISG

Julie Miville-Dechêne

I have to say, first of all, that Bill C-63 doesn't talk about age verification. There's nothing in this bill about age verification—the words are not even used—and there's very little on pornography. Bill C-63 talks about the very vague concept of “age appropriate design”. It says there should be age-appropriate design; I'm sorry, but age-appropriate design is not age verification. It could be at some point if a committee so decides, but it's not in the bill. That's the first thing I wanted to say.

Regarding privacy, we have laws in Canada. Why would—

May 27th, 2024 / 5:15 p.m.
Liberal

Taleeb Noormohamed Liberal Vancouver Granville, BC

Thank you so much.

It's nice to finally get to a place where we can have a good conversation about this bill.

Senator, I think we all agree that what you are trying to accomplish is very important, and I think there are many means by which to do that. I would submit that Bill C-63 takes into consideration many of the issues you seek to resolve.

One of the concerns I did want to hear from you on is the whole issue of online privacy. Could you briefly explain what impact this bill might have on online privacy for Canadians? Would there be any concerns with respect to the privacy of online users?

May 27th, 2024 / 12:50 p.m.
Bloc

Rhéal Fortin Bloc Rivière-du-Nord, QC

Thank you, Madam Chair.

I want to thank the four witnesses for joining us today.

What's happening on campuses right now is very concerning. I think you are experiencing somewhat similar situations across Canada, including in Quebec.

You have been talking about this since the beginning of your remarks, but I would like to hear you talk more about the challenge that arises when it comes to respecting freedom of expression while avoiding hate speech or outbursts of that nature.

In my view, a university has always been a hotbed for exchanges, even heated exchanges, among students and professors on various subjects, including the thorniest ones. I'm always a little troubled when we talk about limiting freedom of expression, especially at a university.

That said, we believe that hate speech is unacceptable. However, it is difficult to define what is hate and what is not. As we said earlier, Bill C‑63 proposes provisions in this regard.

Another thing I find problematic is what is called the religious exception in the Criminal Code, which allows hate speech or antisemitic speech based on a religious text.

All these things are problematic. I will try to summarize by asking the witnesses my questions in the order in which their names appear on the notice of meeting.

Mr. Carr, at Concordia University, how do you plan to combat the problem of hate speech while respecting freedom of expression? Do encampments actually play an important role in terms of hate speech and freedom of expression?

May 27th, 2024 / 12:10 p.m.
D/Chief Robert Johnson

I've recently reviewed Bill C-63, the online harms act, and I do support it.

May 27th, 2024 / 12:10 p.m.
Liberal

The Chair Liberal Lena Metlege Diab

Thank you very much to our witnesses.

As the chair, I have one quick question for the police.

You talked in your recommendation about hate crime. The online harms act that's been introduced, Bill C-63, attempts to enshrine the definition of hatred in the Criminal Code. I'd like to know if you support that or if you have any recommendations on it.

Before you answer that, I will say to all our witnesses, please submit anything in writing that you feel that you did not get a chance to get out here this morning.

We have 30 seconds for the police, specifically on the hate crime definition.

Department of Justice—Main Estimates, 2024-25 / Business of Supply / Government Orders

May 23rd, 2024 / 11 p.m.

Liberal

Arif Virani Liberal Parkdale—High Park, ON

Mr. Speaker, I think the track record of the previous Harper government, in which the Leader of the Opposition played a part in its cabinet, is demonstrably curious with respect to that barbaric cultural practices hotline suggestion, with respect to interdictions on the citizenship ceremonies and what people could wear, and with respect to approaches towards settlement of Syrian refugees and who would be selected for settlement in Canada and who would not. The track record is not an enviable one.

On this side of the House, we stand completely opposed to such policies and have implemented policies that are vastly different. That includes challenging Islamophobia. That includes funding for the security infrastructure program to protect places of worship. That includes Bill C-63, which would tackle Islamophobia head-on and help keep all Canadians safe.

May 23rd, 2024 / 10:55 p.m.

Liberal

Arif Virani Liberal Parkdale—High Park, ON

Mr. Speaker, I think that is actually appalling, given where we are with the alarming rise in anti-Semitism post October 7. We need to be doing everything we can to shore up the Jewish community and its need for safety and security at this time.

Apropos of that, I find it very troubling that the opposition articulated by the Leader of the Opposition to a bill that I am shepherding through this chamber, Bill C-63, was so vociferous that he did not even wait to read the document. He came out against it before it was even tabled. This is the very same document that groups like CIJA have gone on record about, saying that if we tackle online hatred, we will help them stop anti-Semitism online from turning into real-world consequences in the physical world.

Bill C-63 is critical for the safety of the Jewish community, as it is critical for many vulnerable groups, including Muslims and Arabs, the LGBTQ community, the Black community and the indigenous community. That is what we need to stand for as Canadians. That is what the opposition leader is standing against.

May 23rd, 2024 / 10:20 p.m.

Liberal

Arif Virani Liberal Parkdale—High Park, ON

Mr. Speaker, I would be very open to looking at what is transpiring in California. Centring victims at the heart of our criminal justice strategy is important, and we have been attempting to do that with respect to victims of hatred, through the online hate bill; victims of child sex predation, through Bill C-63; victims of intimate partner violence, through our changes to the bail regime, not once but twice, through Bill C-48 and Bill C-75; and fundamentally, victims of gun violence in this country, through bills like Bill C-21, which would put a freeze on handgun sales and ensure tougher penalties with respect to things like gun trafficking. These are important provisions, but I am definitely willing to entertain suggestions about what California is doing and look at whether the model could be brought over.

May 23rd, 2024 / 10:20 p.m.

Liberal

Arif Virani Liberal Parkdale—High Park, ON

Mr. Speaker, I have a few responses. First of all, Bill C-63 contemplates a responsibility to file a digital safety plan with the new commissioner to indicate how one is going to moderate risk for one's users, and lastly, to be vetted against that moderation and to be subject to penalties or orders by the digital safety commissioner.

It also contemplates the idea that the digital safety commissioner could green-light researchers at universities around the country to get access to some of the inner workings of the platforms. This has been hailed by people like Frances Haugen, the famous Facebook whistle-blower, as internationally leading legislation on promoting some of the transparency the member opposite is seeking, which I seek as well.

May 23rd, 2024 / 10:20 p.m.

Green

Elizabeth May Green Saanich—Gulf Islands, BC

Mr. Speaker, I would like to turn to Bill C-63. I support Bill C-63, the online hate bill, but I do not think it adequately gets to some of the questions of algorithms.

I think we have a real problem with rage farming. Some of the examples I have raised tonight are specifically useful because they raise ire and quick reaction and can be used to change public opinion through the manufacturing of a degree of rage that might otherwise not exist if all the facts were thoroughly discussed.

Does the minister believe that Bill C-63 could get at something like rage farming without getting at the algorithms?

May 23rd, 2024 / 10:05 p.m.

Liberal

Arif Virani Liberal Parkdale—High Park, ON

Mr. Speaker, first, I would like to point out that Meta's response was also surprising, because there are a lot of penalties set out in Bill C-63, but Meta is still comfortable working with us.

With regard to the second question, I want to say that we stand up for the protection of both official languages across Canada under the Official Languages Act.

If that means giving the courts and the federal court administration across Canada more funding, then we are there to listen to those concerns and provide the resources necessary to improve access to justice in both official languages, including French, for all Canadians.

May 23rd, 2024 / 10 p.m.

Bloc

Martin Champoux Bloc Drummond, QC

Mr. Speaker, I will move on to another subject that I think is extremely important: Bill C-63.

Earlier this evening, my colleague, the member for Avignon—La Mitis—Matane—Matapédia, addressed this issue, among others, regarding the Bloc Québécois's suggestion to split part 1 of Bill C‑63 from the other parts so that the digital safety commission can be created as quickly as possible.

My concern is that we are all witnessing and aware of an appalling proliferation of hateful content on social media, including disinformation and aggressive fake accounts, often directed at vulnerable individuals or groups. This should be very worrisome not just to individuals, but to society as a whole.

How does the minister intend to pass a bill that is already being challenged, in a time frame that reflects the urgency of the situation?

May 23rd, 2024 / 8:55 p.m.

Liberal

Arif Virani Liberal Parkdale—High Park, ON

Madam Chair, we have tabled that legislation. We are looking forward to having it voted on in the House and proceeding to committee as fast as possible because the luring she mentioned is child predation. It is something that she and I hopefully can agree that we need to cure. That is one of the things that would be tackled through this legislation, among other things.

She has been spending a lot of time talking about women's rights. Women who are cowed through revenge porn would also be addressed through Bill C-63 because it is a second form of content that would be subject to a 24-hour takedown requirement. Surely we can agree on the necessity of prioritizing—

May 23rd, 2024 / 8:55 p.m.

Conservative

Michelle Ferreri Conservative Peterborough—Kawartha, ON

Madam Chair, we have an 815% increase in online sexual luring under this minister's watch. He is trying to distract. He does not want to answer the questions. He is the one who brought up his proverbial Bill C-63 that is going to solve all these problems. He said Canada is not unsafe, yet we have stats that show a 101% increase in gun crime.

Why, if Bill C-63 is so important and he is so worried about public safety and so worried about victims, has he not brought it forward to the House?

May 23rd, 2024 / 8:55 p.m.

Conservative

Michelle Ferreri Conservative Peterborough—Kawartha, ON

Madam Chair, it is absolutely desperate and pathetic, and that is a shameful response.

This is my last question. The minister says he is so concerned about Bill C-63, which he is in charge of bringing forward to the House. If it is so important to protect children, why has he not done it?

May 23rd, 2024 / 8:45 p.m.

Liberal

Arif Virani Liberal Parkdale—High Park, ON

Madam Chair, I would say that we can start by moving with pace on Bill C-63. It talks about the fact that hate crimes are up 130% over the last five years in this country. We know that the hatred people are exposed to online has real-world consequences. Look no further than the trials of the individuals who were killed at the Quebec City mosque and the trials of the Afzaal family, who were killed in London, Ontario.

How do we cure this? We take a Supreme Court definition of hatred and entrench it in law. That is something that law enforcement has asked us for. Again, I hope the members opposite are listening. Law enforcement and police officers have asked us for these changes because they want to facilitate the work of their hate crimes units in identifying what is happening and laying charges for what is happening. By enhancing penalties under the Criminal Code, by entrenching a definition of hatred in the Canadian Human Rights Act that facilitates discrimination complaints for online hate speech and by ensuring that we are having this content addressed by social media platforms, we can address this from multiple angles.

This is critical toward keeping people safe, now more than ever, when hatred is on the rise, whether it is the anti-Semitism the member just spoke about, whether it is the Islamophobia we have seen with such fatal consequences, whether it is attacks towards the LGBTQ2 community or whether it is attacks against indigenous people in the Prairies. This is rife right now. The time to act is now, not at some future date, to keep Canadians safe. This must be a priority for every parliamentarian here. Does that mean that we have the perfect bill? Absolutely, it does not mean that. I am open to amendments. We need to get this bill to the justice committee so that we can hear from experts about how a good bill can be strengthened further.

May 23rd, 2024 / 8:45 p.m.

Liberal

Arif Virani Liberal Parkdale—High Park, ON

Madam Chair, what I said at that committee, I will say again here: the Lego in my basement is subject to more restrictions than the screens my children are on. That has to change.

We need to change the incentivization on social media companies from monetary incentivization to safety incentivization. This legislation would create a duty to protect children and a duty to remove content. I hope the opposition is listening. The prosecution would be facilitated, in terms of child sex predators, by making changes to the Mandatory Reporting Act, such that the evidence must be preserved for one year. Someone will have up to five years to lay a charge. All entities, including social media companies, must report, and they must report to a central clearing facility. That is critical to facilitating the prosecutions. That is what law enforcement has asked us for. That is what the mothers and fathers affected by things like sextortion around this country have asked us for. That is what will help keep kids from being induced to self-harm, which includes, sadly and tragically, suicide in the case of Carson Cleland in Prince George, B.C., and so many other children around this country.

What we understand from the Centre for Child Protection is that 70 times per week they get notifications of sextortion, and that is only the kids who are coming forward. It is critical to address this issue with haste. We need to pass Bill C-63 at second reading and get it to committee to hear from experts about the pressing need for this bill.

May 23rd, 2024 / 8:45 p.m.

Liberal

Lena Metlege Diab Liberal Halifax West, NS

Madam Chair, I want the minister to speak a little more on this specific topic. I actually received a number of communications in my constituency from parents and grandparents who are very concerned about their children and about the fact that they are so preoccupied these days with online platforms. In fact, my recollection is that the justice minister was at our Standing Committee on Justice and Human Rights, and he said that the most dangerous toys that Canadian families have are the screens their children use.

Can the minister explain that a little further and speak a little more about the measures in Bill C-63? I think that fundamentally it is a very alarming topic to many in my constituency and across the country.

May 23rd, 2024 / 8:40 p.m.

Parkdale—High Park Ontario

Liberal

Arif Virani LiberalMinister of Justice and Attorney General of Canada

Madam Chair, by way of addressing a couple of points on extortion, what I would indicate for the benefit of the House is that we have announced an RCMP national coordination and support team to help coordinate investigations of extortion, and that extortion remains subject to a maximum life imprisonment penalty, which the Supreme Court has indicated demonstrates the seriousness of the offence.

With respect to the question about Bill C-63, I welcome this question. Keeping kids safe is everyone's responsibility in this chamber. This legislation, Bill C-63, would require a takedown within 24 hours of any material that constitutes child sex exploitative material. It would require a risk analysis and a risk reduction of material that induces a child to self-harm or bullies or intimidates a child. That is about doing right by people like Amanda Todd's mother and Rehtaeh Parsons' mother and so many kids who are being sextorted and exploited online.

May 23rd, 2024 / 8:35 p.m.

Liberal

Lena Metlege Diab Liberal Halifax West, NS

Madam Chair, it is a pleasure to rise today in the chamber. I will be providing remarks and using the remainder of my minutes, after my remarks, with some questions for the minister.

I am pleased to speak this evening to an important keystone of access to justice, and that is legal aid. There are so many things one can speak on, but I have to limit what I can say here tonight in the minutes I have available.

While legal aid is not covered in the appropriations requested under the main estimates, budget 2024 includes measures to increase funding to criminal legal aid as well as legal aid for immigrants and refugees. It also includes new funding for impact of race and culture assessments. These proposed increases are contained within Bill C-69, the budget implementation act, which is now going through Parliament.

I want to give a short preamble to my comments on legal aid.

Our work on access to justice is aligned with broader Government of Canada work to achieve the sustainable development goals, including SDG 16, which speaks to a peaceful, just and inclusive society.

Our government is moving forward on this objective thanks to a person-centred approach. That means that we are focusing on the various needs of people with justice issues. The system must take into account people's situations.

This includes any history of victimization, mental health or substance use. In this vein, we are committed to addressing the root causes of crime, recognizing that this is the most effective way to build safer communities. Fair and equal access to justice also means ensuring respectful and timely processing without discrimination or bias.

We recognize that racism and systemic discrimination exist in our institutions. We know indigenous people, Black people and members of other racialized communities are grossly overrepresented in Canada's criminal justice system as both victims and offenders. In fact, we have heard plenty of testimony on that aspect at the Standing Committee on Justice and Human Rights.

This brings me to the topic of legal aid.

A strong legal aid system is one of the pillars that advances access to justice in our justice system. However, not everyone has equal access to legal aid and representation. Lawyers are costly and the courtroom can be a confusing place.

Legal aid assists economically disadvantaged people in obtaining legal assistance and fair representation. We are committed, together with our provincial and territorial counterparts, to ensuring stable and predictable funding for legal aid so that Canadians can access justice.

Funding for criminal legal aid is marked as a decrease in the main estimates. While it is reflected as such, Bill C-69, and the justice minister addressed this in a previous question, proposes to renew this funding to provide $440 million over five years starting in 2024-25. The renewed funds would support access to justice for Canadians who are unable to pay for legal support.

We know that would be particularly helpful for indigenous people, Black people, members of other racialized communities and people with mental health problems, who are all overrepresented in Canada's criminal justice system.

As I mentioned, improving access to legal aid is possible only with continued collaboration between our governments, the provinces and the territories. The proposed renewed federal contribution will assist them in paving the way to greater access to justice, especially for vulnerable groups. We are also committed to ensuring the ongoing delivery of legal aid in immigration and refugee matters with eight provincial partners. That includes Nova Scotia.

The world is facing an unparalleled flow of migrants and refugees, and Canada is no exception. I have heard their stories, heard about the lives they left behind and heard about the challenges that they have to face in a new country, no matter how welcoming it may be, particularly when they have to deal with unfamiliar, complicated legal processes.

That is why our government is firmly committed to upholding a fair and compassionate refugee protection system. Part of this work is making sure that refugees have access to legal representation, information and advice. That is why budget 2024 proposes to provide $273.7 million over five years, starting in 2024-25, and $43.5 million ongoing to maintain federal support for immigration and refugee legal aid services in eight provinces where services are available. This includes an additional $71.6 million this fiscal year.

The funding will improve access to justice for asylum seekers and others involved in certain immigration proceedings who may not have the means to hire legal representation. Immigration and refugee legal aid supports fair, effective and efficient decision-making on asylum and certain immigration claims by helping individuals present the relevant facts of their case in a clear and comprehensive manner.

To improve these specific legal aid services, Justice Canada works in tandem with provincial governments and legal aid service providers, as well as with Immigration, Refugees and Citizenship Canada. We want to collectively ensure that we have stable and predictable ongoing funding for these important services.

Before I conclude, I also want to touch on another important item that would be supported by Bill C-69, impact of race and culture assessments, which would help the courts understand how racism and discrimination have contributed to a Black or racialized person's interactions with the criminal justice system. Budget 2024 proposes to provide an additional $8 million over five years and $1.6 million ongoing to expand these assessments in more jurisdictions.

On access to justice for all Canadians, we are committing to ensuring that the justice system is fairer for all. I will now continue with the time that I have left to pose a couple of questions to the minister.

My first question is going to centre on the online harms act, Bill C-63. I just want to preface it by saying that the online harms act is something that many of us are very concerned about these days. Obviously, we always were, but the concern is heightened. It is to combat online hate, but it is also to protect our children from sexual exploitation and other harms. One cannot happen without the other.

Can the minister please comment on this and, specifically, can he explain to Canadians and to the House why it is essential to raise Bill C-63 in the context of protecting our children?

May 23rd, 2024 / 7:55 p.m.

Liberal

Arif Virani Liberal Parkdale—High Park, ON

Madam Chair, I am pleased to see any efforts that deal with combatting hatred, which is unfortunately spiralling in terms of anti-Semitic incidents and Islamophobic incidents. There is a 130% rise in hate crimes in this country in the last five years. That informs the necessity for bills such as Bill C-63, the online harms bill, which will tackle things like hatred and its festering online, which has real-world consequences. It is very unfortunate that Canada ranks number one in the G7 for the number of deaths of Muslims in the last seven years, 11 in total, due to Islamophobic acts of hate.

What I would say, with respect to this bill, is that we are looking at it closely. I would also reiterate for the member's edification that we amended the hate propaganda provisions to include Holocaust denialism and willful promotion of anti-Semitism within the fold of sections 318 and 319, the hate propaganda offences. That was done within the last two years, I believe.

May 23rd, 2024 / 7:45 p.m.

Liberal

Arif Virani Liberal Parkdale—High Park, ON

Madam Chair, I think that the suggestion about hate, the Bloc Québécois's private member's bill and our Bill C-63 highlight the fact that we need to pass this bill at second reading and send it to the Standing Committee on Justice and Human Rights so that we can study it, hear from experts and witnesses and propose amendments, if a few turn out to be appropriate.

May 23rd, 2024 / 7:40 p.m.

Bloc

Kristina Michaud Bloc Avignon—La Mitis—Matane—Matapédia, QC

Madam Chair, the government is completely ignoring Bill S‑210. Bill C‑63 is a huge bill that has received some criticism. It is likely to take a long time to study.

However, we think the proposal to set up a digital safety commission is a good idea that should be implemented quickly. That is why we are proposing that the bill be split, quite simply, so that we can take the time to properly study all harmful content while still setting up the digital safety commission quickly. I understand that the proposal has not been accepted, but I still think it is a good idea.

The topic of harmful content brings me to hate speech. Will the minister commit to abolishing the Criminal Code exemption that allows hate speech in the name of religion? In fact, that would be a great addition to his Bill C‑63.

May 23rd, 2024 / 7:40 p.m.

Liberal

Arif Virani Liberal Parkdale—High Park, ON

Madam Chair, I have several answers to give on this matter. The big difference between the senator's bill and Bill C‑63 is that our bill had the benefit of a five-year consultation. That is the first thing.

The second thing is that, although we agree with some aspects, we want to work in close collaboration with the big digital companies to resolve the situation and protect the public and children from pornography. Taking down that information and content within a mandatory 24-hour period is a much stronger measure than what was proposed in the bill introduced by the senator.

The last thing is that we are targeting a situation where all harmful online content needs to be addressed. This concerns children, teenagers and adults. We want a big solution to a big problem. Australia started nine years ago with children only. Nine years later, protecting children only is no longer appropriate—

May 23rd, 2024 / 7:40 p.m.

Bloc

Kristina Michaud Bloc Avignon—La Mitis—Matane—Matapédia, QC

Madam Chair, I politely beg to differ. I feel that Bill C‑63 is extremely important, but it is not exactly the same thing. Yes, it contains elements that make it possible to regulate or, at least, be warned before consuming certain types of content, but there is nothing that really makes it possible to verify the consumer's age.

I would therefore advise the government to support a bill like Bill S‑210. Obviously, it is not easy to implement this type of safeguard, and other countries are currently looking at that. However, it is an extremely important bill.

To return to Bill C‑63, would the minister agree that the first part of the bill could be split from the rest so that the digital security commission could be created as quickly as possible? That would enable us to protect female victims of intimate content communicated without consent, including deepfakes.

May 23rd, 2024 / 7:40 p.m.

Liberal

Arif Virani Liberal Parkdale—High Park, ON

Madam Chair, with all due respect, I want to correct the member opposite.

First, Bill C‑63 deals mainly with types of content that are appropriate for children. Second, it addresses the obligation to protect children. There is also a provision of Bill C‑63 that talks about age-appropriate design features.

We are targeting the same problem. We want to work with social media platforms to resolve this situation in a way that will enable us to protect people's privacy and personal information and protect children.

May 23rd, 2024 / 7:40 p.m.

Bloc

Kristina Michaud Bloc Avignon—La Mitis—Matane—Matapédia, QC

Madam Chair, I think that the minister is well aware that those are two completely different missions. Both are commendable.

Bill C‑63 has its good points, but Bill S‑210 really seeks to check the age of pornography users to limit young people's access to it. The Liberal Party seems to disagree with this bill, and yet other countries, like Germany, France and the United Kingdom, as well as some states in the U.S. are looking into this way of verifying the age of users.

Why does Canada not want to move forward in this way to limit the access of children under the age of 18 to pornography?

May 23rd, 2024 / 7:40 p.m.

Liberal

Arif Virani Liberal Parkdale—High Park, ON

Madam Chair, that is a great question, but I believe that the senator's bill, Bill S‑210, addresses only one aspect of our broader bill, C‑63.

Protecting children from pornography and sexual predators is a priority for both me and the senator. However, we have different ways of tackling the problem. We are dealing with a much bigger and broader problem in our own Bill C-63. We are also different when it comes to the mandates and the modus operandi that the senator proposes to use.

We are concerned about how to verify someone's age. Does it have to be a piece of government-issued ID? Will this cause other problems or lead to the possibility of other crimes, such as financial fraud, at the international level?

May 23rd, 2024 / 7:35 p.m.

Bloc

Kristina Michaud Bloc Avignon—La Mitis—Matane—Matapédia, QC

Madam Chair, I would point out to the minister that he does not want to give Quebec an exemption from the Criminal Code, but he is giving one to British Columbia. In my view, the same should be possible for people in that situation in Quebec.

Now, I would like to hear his comments on all the issues related to child pornography, children's access to pornography and the sharing of non-consensual content. In my view, the purpose of Bill S‑210, which was introduced by Senator Julie Miville‑Dechêne and which seeks to prevent minors from accessing pornography, is completely different from the purpose of Bill C‑63, which the minister introduced and which seeks to protect the public from harmful content streamed on social media, such as intimate content communicated without consent and content that sexually victimizes a child.

Does he agree with me that these two bills have completely different purposes?

May 23rd, 2024 / 7:15 p.m.

Parkdale—High Park Ontario

Liberal

Arif Virani LiberalMinister of Justice and Attorney General of Canada

Mr. Speaker, I will be providing 10 minutes of remarks, and I will be welcoming questions from my parliamentary secretary, the member for Etobicoke—Lakeshore. I will be using my time to discuss measures in the recent budget to combat crime, especially auto theft and money laundering. I will also touch on legal aid investments and provide an update of our work on online safety.

Auto theft is a serious problem that affects communities across the country. Not only does it affect people's wallets, it also causes them to feel unsafe. The number of these thefts has risen and, in some areas, they are growing more violent. These criminals are increasingly emboldened. Our government is committed to ensuring that police and prosecutors have the tools they need to respond to cases of auto theft, including thefts related to organized crime.

We also want to ensure that the legislation provides courts with the wherewithal to impose sentences commensurate with the seriousness of the crime. The Criminal Code already contains useful provisions for fighting auto theft, but we can do more.

This is why we are amending the Criminal Code to provide additional measures for law enforcement and for prosecutors to address auto theft. Bill C-69, the budget implementation act, sets out these proposed measures. These amendments would include new offences targeting auto theft and its links to violence and organized crime; new offences for possession and distribution of a device used for committing auto theft, such as key-programming machines; and a new offence for laundering proceeds of crime for the benefit of, at the direction of, or in association with, a criminal organization. We are proposing a new aggravating factor at sentencing, which would be applied to an adult offender who involves a young person in the commission of the crime. These changes are part of the larger federal action plan on combatting auto theft that was just released on May 20.

Auto theft is a complex crime, and fighting it involves many partners: the federal, provincial, territorial and municipal governments, industry leaders and law enforcement agencies.

I will now turn to the related issue of money laundering. Addressing money laundering will help us to combat organized crime, including its involvement in automobile theft. However, the challenges associated with money laundering and organized crime go beyond auto theft.

That is why we are continually reviewing our laws so that Canada can better combat money laundering, organized crime and terrorist activity financing.

Bill C-69 would give us more tools to combat money laundering and terrorist financing. These new measures would allow courts to issue an order that requires a person to keep an account open to assist in the investigation of a suspected criminal offence. Currently, financial service providers often unilaterally close accounts where they suspect criminal activity, which can actually hinder police investigations. This new proposed order would help in that regard.

I hope to see non-partisan support from all parties, including the official opposition, on these measures to address organized crime. It would be nice to see its members support something, rather than simply use empty slogans or block actual solutions. We see this as well in their efforts to block Bill C-59, the fall economic statement, which has been in this chamber for literally months. That also contains a range of measures to combat money laundering, which have been asked for by law enforcement. For a party that prides itself on having a close relationship with law enforcement, I find this obstruction puzzling.

What is more, under Bill C-69, the courts would also be authorized to issue a repetitive production order for documents. Such an order would require a person to produce specific information to support a criminal investigation on several pre-determined dates over a defined period.

These two proposals resulted from the public consultations that our government held last summer. We are committed to getting Bill C-69 passed by Parliament in a timely manner so that the new measures can be put in place as quickly as possible and so that we can crack down on these serious crimes as soon as possible.

I would now like to discuss our investments in legal aid. Just as we need to protect Canadians from crime, we also need to ensure that people have equitable access to justice, which is an integral part of a fair and just society, and a strong legal aid system is a key aspect of this. It strengthens the overall justice system. Budget 2024 includes measures to increase funding to criminal legal aid as well as legal aid for immigrants and for refugees to Canada.

For criminal legal aid, budget 2024 provides $440 million over five years, starting in 2024-25. This would support access to justice for Canadians who are unable to pay for legal support, in particular indigenous people, Black individuals and members of other racialized communities, who are overrepresented in the criminal justice system. Legal representation also helps to clear backlogs and delays in our court system.

This essential work is only possible with continued collaboration between federal, provincial and territorial governments. The proposed increase to the federal contribution will assist provinces and territories to take further actions to increase access to justice. This legal aid will help with the backlogs I just mentioned. Unrepresented and poorly represented litigants cause delays in our justice system. Making sure that these individuals have proper support and representation will help ensure access to a speedy trial. This, in combination with our unprecedented pace of judicial appointments, 106 appointments in my first nine months in office, will also address backlogs. In comparison, the previous Harper government would appoint 65 judges per year on average. I exceeded that amount in six months.

For immigration and refugee legal aid, budget 2024 would provide $273.7 million over five years, starting in 2024-25, and $43.5 million per year ongoing after that. This funding would help support access to justice for economically disadvantaged asylum seekers and others involved in immigration proceedings. This investment would help maintain the confidence of Canadians in the government's ability to manage immigration levels, and to resettle and integrate refugees into Canadian society. To do this very important work, Justice Canada continues to collaborate with provincial governments and with legal aid service providers, as well as Immigration, Refugees and Citizenship Canada. Together, we are exploring solutions to support sustainable access to immigration and refugee legal aid services.

Before I conclude, I would like to talk a little about Bill C-63, which was raised by the member for Fundy Royal. The bill addresses online harms and the safety of our communities online. Much has already been said about this very important legislation, which would create stronger protections for children online and better safeguards for everyone in Canada from online hate and other types of harmful content. What is critical about this bill is that it is dedicated to promoting people's participation online and not to limiting it.

This legislation is informed by what we have heard over five-plus years of consultations with diverse stakeholders, community groups, law enforcement and other Canadians. This bill focuses on the baseline responsibilities of social media platforms to manage the content they are hosting and their duty to keep children safe, which means removing certain types of harmful content and entrenching a duty to act responsibly.

This bill is about keeping Canadians safe, which is my fundamental priority and my fundamental duty as the Minister of Justice and Attorney General of this country. It is about ensuring that there is actually a takedown requirement for the two most harmful types of material: child pornography and the non-consensual sharing of intimate images, also known as revenge pornography.

There are five other categories of material that would be dealt with under this bill: content that incites violence, content that incites terrorism, content that foments hatred as defined by the Supreme Court of Canada, content used to bully a child and content that induces a child to self-harm. I am speaking now not only as the Minister of Justice but also as a father. I think that there is nothing more basic in this country for any parent or parliamentarian than keeping our children safe.

I am thankful for the opportunity to speak about how we are making Canada safer and making our justice system stronger, more accessible and more inclusive for all people.

Department of Justice—Main Estimates, 2024-25 / Business of Supply / Government Orders

May 23rd, 2024 / 7:05 p.m.

Liberal

Arif Virani Liberal Parkdale—High Park, ON

Mr. Speaker, I find this line of questioning quite fascinating, given that the main charter question raised by Bill C-63 concerns the very sensitive matter of freedom of speech, which is protected under section 2(b).

What I will do is always maintain my oath under the Constitution to uphold the Constitution and people's charter rights. This individual works under a leader who has brandished the idea of using the notwithstanding clause to deprive people of their charter rights. Section 2(b) is subject to the notwithstanding clause.

If we are talking about who is actually committed to protecting people's freedoms, including freedom of speech, people on that side of the House should be looking at themselves in the mirror.

Department of Justice—Main Estimates, 2024-25 / Business of Supply / Government Orders

May 23rd, 2024 / 7:05 p.m.

Conservative

Rob Moore Conservative Fundy Royal, NB

Mr. Speaker, I notice once again that I have given the minister a lot of opportunities, and he has not answered any of my questions directly.

He knows the answer to this one, and he is not going to give it, so I will have to give it on his behalf. The Victoria Police Department statement says, “Bill C-75, which came into effect nationally in 2019, legislated a 'principle of restraint' that requires police to release an accused person at the earliest possible opportunity”.

The police laid the blame squarely at the feet of the minister for this individual being released three times in a row to revictimize Canadians. A woman was injured during one of the thefts.

On the issue of the Liberals' draconian Bill C-63, which Margaret Atwood has described as “Orwellian”, has he completed a charter statement for this bill that clearly threatens the rights of Canadians?

May 9th, 2024 / 11:05 a.m.

Philippe Dufresne Privacy Commissioner of Canada, Offices of the Information and Privacy Commissioners of Canada

Thank you, Mr. Chair.

Members of the committee, I'm pleased to be here today to discuss the Office of the Privacy Commissioner of Canada's main estimates for fiscal year 2024-25 and to describe the work of my office to protect and promote the fundamental right to privacy of Canadians. I'm accompanied by Richard Roulx, deputy commissioner, corporate management sector.

In January I launched a strategic plan that lays out three key priorities that will guide the work of the OPC through 2027. The first is protecting and promoting privacy with maximum impact, by using business intelligence to identify trends that need attention, producing focused guidance and outreach, leveraging strategic partnerships and preparing for the implementation of potentially new privacy legislation.

The second is addressing and advocating for privacy in this time of technological change, with a focus on artificial intelligence and generative AI, the proliferation of which brings both potential benefits and increased risks to privacy.

The third is championing children's privacy rights to ensure that their unique privacy needs are met and that they can exercise their rights.

I believe that these three priorities are where the Office of the Privacy Commissioner can have the greatest impact for Canadians, and that these are also where the greatest risks lie if the issues are not addressed.

Protecting privacy is one of the paramount challenges of our time. My office is poised to meet this challenge through strong advocacy, collaboration, partnerships, education, promotion, enforcement and capacity building, which includes doing more to identify and address privacy trends in a timely way.

Investigations under the Privacy Act, which covers the personal information-handling practices of federal government departments and agencies, and the Personal Information Protection and Electronic Documents Act, Canada’s federal private sector privacy law, are a key aspect of the Office of the Privacy Commissioner’s work on issues that significantly impact the lives of Canadians.

In February I made public the results of my investigation into Aylo, the operator of the website Pornhub and other pornographic websites. I found that the company had contravened Canada's federal private sector privacy law by enabling intimate images to be shared on its websites without the direct knowledge and consent of everyone who is depicted.

In releasing my report on this investigation, I reiterated that the non-consensual sharing of intimate images is a serious privacy violation that can cause severe harms to victims, and that organizations have an obligation under privacy law to prevent and remedy this.

This case is also relevant to the discussions that will be taking place on Bill C-63, and I will welcome the opportunity to share my views on the online harms act with parliamentarians.

I also look forward to sharing in the coming months the findings of two high-profile investigations that are closely tied to two of my strategic priorities—protecting children’s privacy and addressing the privacy impacts of emerging technology, including AI.

When I appeared before you last year on the main estimates, I spoke about the launch of investigations into TikTok, as well as OpenAI, the company behind the AI-driven text-generation chatbot ChatGPT. Both investigations are being conducted jointly with my counterparts in Quebec, British Columbia and Alberta.

In the case of the TikTok investigation, the four offices are examining whether the practices of the company ByteDance comply with Canadian federal and provincial privacy legislation and, in particular, whether valid and meaningful consent is being obtained for the collection, use, and disclosure of personal information.

Given the importance of protecting children's privacy, the joint investigation has a particular focus on TikTok's privacy practices as they relate to younger users.

The investigation into OpenAI and its ChatGPT chat bot is examining whether the company is compliant with requirements under Canadian privacy law in relation to consent, openness, access, accuracy and accountability. It is also considering whether the collection, use and disclosure are done for an appropriate purpose.

Both investigations remain a high priority and we are working to complete them in a timely manner.

Protecting and promoting privacy with maximum impact remains integral to fulfilling my current mandate and preparing for potential changes to federal privacy law.

In the 2023 budget we received temporary funding to address pressures related to privacy breaches and a complaints backlog, as well as to prepare for the implementation of Bill C-27. While these temporary funds provide necessary and immediate support, it is essential that my office be properly resourced on a permanent basis to deal with the increasing complexity of today's privacy landscape and the associated demands on my office's resources.

To address this, we will continue to present fiscally responsible funding requests and will also aim to maximize agility and cost-effectiveness by assessing and streamlining program and service delivery.

With that, I would be happy to answer your questions. Thank you.

Stopping Internet Sexual Exploitation Act / Private Members' Business

May 7th, 2024 / 6:45 p.m.

Conservative

Arnold Viersen Conservative Peace River—Westlock, AB

Mr. Speaker, I am grateful for the opportunity to wrap up the debate on the SISE act at second reading.

I have appreciated listening to the members give their speeches. At the outset, I want to briefly urge members to use the term “child sexual abuse material”, or CSAM, rather than “child pornography”. As we heard from the member for Kamloops—Thompson—Cariboo, the latter term is being replaced with CSAM because the word “pornography” allows for the idea that the material could be consensual. That is why the member for Kamloops—Thompson—Cariboo has put forward a bill that would make this change in the Criminal Code as well.

During the first hour of debate, we heard from the member for Laurentides—Labelle, who gave a passionate speech outlining the many serious issues of the impact of the pornography industry on women and youth. I simply do not have the time to include all of that in my speech, but we both sat on the ethics committee during the Pornhub study and heard directly from the survivors who testified.

It was the speech, however, from the Parliamentary Secretary to the Leader of the Government in the House of Commons that left me scratching my head. I do not think he actually read Bill C-270 or even the Liberals' own bill, Bill C-63. The parliamentary secretary fixated on the 24-hour takedown requirement in Bill C-63 as the solution to this issue. However, I do not think anyone is opposed to a 24-hour takedown for exploitative intimate content shared without consent or for child sexual abuse material. In fact, a bill focused solely on the 24-hour takedown would pass very quickly through this House with the support of everyone, but that does not take into account what Bill C-270 is trying to do. It completely misses the point.

The 24-hour takedown has effect only after harmful content has been put up, such as CSAM, deepfakes and intimate images that have been shared. Bill C-270 is a preventative upstream approach. While the takedown mechanism should be available to victims, the goal of Bill C-270 is to go upstream and stop this abusive content from ever ending up on the Internet in the first place.

As I shared at the beginning of the debate, many survivors do not know that their images are online for years. They do not know that this exploitative content has been uploaded. What good would a 24-hour takedown be if they do not even know the content is there? I will repeat the words of one survivor that I shared during the first hour of debate: “I was 17 when videos of me on Pornhub came to my knowledge, and I was only 15 in the videos they've been profiting from.” She did not know for two years that exploitative content of her was being circulated online and sold. That is why Bill C-270 requires age verification and consent of individuals in pornographic material before it is posted.

I would also point out that the primary focus of the government's bill is not to reduce harm to victims. The government's bill requires services “to mitigate the risk that users of the regulated service will be exposed to harmful content”. It talks about users of the platform, not the folks depicted in it. The focus of Bill C-270 is the other side of the screen. Bill C-270 seeks to protect survivors and vulnerable populations from being the harmful content. The two goals could not be more different, and I hope the government is supportive of preventing victims of exploitation from further exploitation online.

My colleague from Esquimalt—Saanich—Sooke also noted that the narrow focus of the SISE act is targeted at people and companies that profit from sexual exploitative content. This is, indeed, one of the primary aims of this bill. I hope, as with many things, that the spread of this exploitative content online will be diminished, as it is driven by profit. The Privacy Commissioner's investigation into Canada's MindGeek found that “MindGeek surely benefits commercially from these non-compliant privacy practices, which result in a larger content volume/stream and library of intimate content on its websites.”

For years, pornography companies have been just turning a blind eye, and it is time to end that. Bill C-270 is a fulfillment of a key recommendation made by the ethics committee three years ago and supported by all parties, including the government. I hope to have the support from all of my colleagues in this place for Bill C-270, and I hope to see it at committee, where we can hear from survivors and experts.

Stopping Internet Sexual Exploitation Act / Private Members' Business

May 7th, 2024 / 6:40 p.m.

Conservative

Garnett Genuis Conservative Sherwood Park—Fort Saskatchewan, AB

Mr. Speaker, I appreciate the opportunity to say a few words in support of Bill C-270, which is an excellent bill from my colleague from Peace River—Westlock, who has been working so hard over his nine years in Parliament to defend the interests of his constituents on important issues like firearms, forestry and fracking, but also to stand up for justice and the recognition of the universal human dignity of all people, including and especially the most vulnerable.

Bill C-270 seeks to create mechanisms for the effective enforcement of legal provisions that already substantively prohibit the non-consensual distribution of intimate images and child pornography. Right now, as the law stands, it is a criminal offence to produce this type of horrific material, but the appropriate legal mechanisms do not exist to prevent its distribution by, for instance, large pornography websites.

It has come to light that Pornhub, which is headquartered in Canada, has completely failed to prevent the presence on its platform of non-consensual and child-depicting pornographic images. This matter has been studied in great detail at parliamentary committees. My colleague from Peace River—Westlock, along with members from other parties, has played a central role in identifying the fact that Pornhub and other websites have not only failed but have shown no interest in meaningfully protecting potential victims of non-consensual and child pornographic images.

It is already illegal to produce these images. Why, therefore, should it not also be clearly illegal to distribute those images without having the necessary proof of consent? This bill would require verification of age and consent for the images that are distributed. It is a common-sense legal change that would effect greater compliance with the existing criminal prohibitions on the creation of these images. It is based on the evidence heard at committee and on the reality that major pornography websites, many of which are headquartered in Canada, continue to allow this material to exist. The fact that those images remain on those websites means that we desperately need stronger legal tools to protect children and to protect people who are victims of the non-consensual sharing of their images.

Further, in response to the recognition of the potential harms to children associated with exposure to pornography or with having images taken of them and published online, there has been discussion in Parliament, and a number of different bills have been put forward to protect children in vulnerable situations. These bills are, most notably, Bill C-270 and Bill S-210.

Bill S-210 would protect children by requiring meaningful age verification for those who are viewing pornography. It is recognized that exposing children to sexual images is a form of child abuse. If an adult were to show videos or pictures to a child of a sexual nature, that would be considered child abuse. However, when websites fail to have meaningful age verification and, therefore, very young children are accessing pornography, there are not currently the legal tools to hold them accountable for that. We need to recognize that exposing young children to sexual images is a form of child abuse, and therefore it is an urgent matter that we pass legislation requiring meaningful age verification. That is Bill S-210.

Then we have Bill C-270, which would protect children in a different context. It would protect children from having their images depicted as part of child pornography. Bill C-270 takes those existing prohibitions further by requiring that those distributing images also have proof of age and consent.

This is common sense; the use of criminal law is appropriate here because we are talking about instances of child sexual abuse. Both Bill S-210 and Bill C-270 deal with child sexual abuse. It should be clear that the criminal law, not some complicated nebulous regulatory regime, is the appropriate mechanism for dealing with child abuse.

In that context, we also have a government bill that has been put forward, Bill C-63, which it calls the online harms act. The proposed bill is kind of a bizarre combination of talking about issues of radically different natures; there are some issues around speech, changes to human rights law and, potentially, attempts to protect children, as we have talked about.

The freedom of speech issues raised by the bill have been well discussed. The government has been denounced from a broad range of quarters, including some of its traditional supporters, for the failures of Bill C-63 on speech.

However, Bill C-63 also profoundly fails to be effective when it comes to child protection and the removal of non-consensual images. It would create a new bureaucratic structure, and it is based on a 24-hour takedown model; it says that if something is identified, it should be taken down within 24 hours. Anybody involved in this area will tell us that 24-hour takedown is totally ineffective, because once something is on the Internet, it is likely to be downloaded and reshared over and over again. The traumatization, the revictimization that happens, continues to happen in the face of a 24-hour takedown model.

This is why we need strong Criminal Code measures to protect children. The Conservative bills, Bill S-210 and Bill C-270, would provide the strong criminal tools to protect children without all the additional problems associated with Bill C-63. I encourage the House to pass these proposed strong child protection Criminal Code-amending bills, Bill S-210 and Bill C-270. They would protect children from child abuse, and given the legal vacuums that exist in this area, there can be no greater, more important objective than protecting children from the kind of violence and sexualization they are currently exposed to.

Stopping Internet Sexual Exploitation Act / Private Members' Business

May 7th, 2024 / 6:30 p.m.

Bloc

Julie Vignola Bloc Beauport—Limoilou, QC

Mr. Speaker, the subject that we are dealing with this evening is a sensitive one. My colleagues have clearly demonstrated that in the last couple of minutes.

We all have access to the Internet and we basically use it for three reasons: for personal reasons, for professional reasons and for leisure, which can sometimes overlap with personal reasons. Pornography is one of those uses that is both for leisure and for personal reasons. To each their own.

The use of pornography is a personal choice that is not illegal. Some people might question that. We might agree or disagree, but it is a personal decision. However, the choice that one person makes for their own pleasure may be the cause of another person's or many other people's nightmare. Basically, that is what Bill C-270 seeks to prevent, what it seeks to sanction. The purpose of the bill is to ensure that people do not have to go through hell because of pornography. This bill seeks to criminalize the fact that, under the guise of legality, some of the images that are being viewed were taken or are being used illegally.

I want to talk briefly about the problem this bill addresses and the solutions that it proposes. Then, to wrap up, I will share some of my own thoughts about it.

By way of context for this bill and the two others being studied, Bills S‑210 and C‑63, it was a newspaper article that sounded the alarm. After the article came out, a House of Commons committee that my esteemed colleague from Laurentides—Labelle sits on looked at the issue. At that time, the media informed the public that videos of women and children were available on websites even though these women and, naturally, these children had never given their consent to be filmed or to have their videos shared. We also learned that this included youths under 18. As I said, a committee looked at the issue. The images and testimony received by the committee members were so shocking that the several bills I mentioned earlier were introduced to try to tackle the issue in whole or in part.

I want to be clear: watching pornography is not the problem—to each their own. If someone likes watching others have sex, that is none of my concern or anyone else's. However, the problem is the lack of consent of the people involved in the video and the use of children, as I have already said.

I am sure that the vast majority of consumers of pornography were horrified to find out that some of the videos they watched may have involved young people under the age of 18. These children sometimes wear makeup to look older. Women could be filmed without their knowledge by a partner or former partner, who then released the video. These are intimate interactions. People have forgotten what intimacy means. If a person agrees to be filmed in an intimate situation because it is kind of exciting or whatever, that is fine, but intimacy, as the word itself implies, does not mean public.

When a young person or an adult decides to show the video to friends to prove how cool it is that they got someone else to do something, that is degrading. It is beyond the pale. It gets to me because I saw that kind of thing in schools. Kids were so pleased with themselves. I am sorry, but it is rarely the girls who are so pleased with themselves. They are the ones who suffer the negative consequences. At the end of the day, they are the ones who get dragged through the mud. Porn sites were no better. They tried to absolve themselves by saying that they just broadcast the stuff and it is not up to them to find out if the person consented or was at least 18. Broadcasting is just as bad as producing without consent. It encourages these illegal, degrading, utterly dehumanizing acts.

I am going back to my notes now. The problem is that everyone is blaming everyone else. The producer says it is fine. The platform says it is fine. Ultimately, governments say the same thing. This is 2024. The Internet is not new. Man being man—and I am talking about humankind, humans in general—we were bound to find ourselves in degrading situations. The government waited far too long to legislate on this issue.

In fact, the committee that looked into the matter could only observe the failure of content moderation practices, as well as the failure to protect people's privacy. Even if the video was taken down, it would resurface because a consumer had downloaded it and thought it was a good idea to upload it again and watch it again. This is unspeakable. It seems to me that people need to use some brain cells. If a video can no longer be found, perhaps there is a reason for that, and the video should not be uploaded again. Thinking and using one's head is not something governments can control, but we have to do everything we can.

What is the purpose of this bill and the other two bills? We want to fight against all forms of sexual exploitation and violence online, end the streaming and marketing of all pornographic material involving minors, prevent and prohibit the streaming of non-consensual explicit content, force adult content companies and streaming services to control the streaming of this content and make them accountable and criminally responsible for the presence of this content on their online sites. Enough with shirking responsibility. Enough with saying: it is not my fault if she feels degraded, if her reputation is ruined and if, at the end of the day, she feels like throwing herself off a bridge. Yes, the person who distributes pornographic material and the person who makes it are equally responsible.

Bill C‑270 defines the word “consent” and the expression “pornographic material”, which is good. It adds two new penalties. Essentially, a person who makes or distributes the material must ensure that the person involved in the video is 18 and has given their express consent. If the distributor does not ask for it and does not require it, they are at fault.

We must also think about some of the terms, such as “privacy” and “education”, but also about the definition of “distributor”, because Bill C-270 focuses primarily on distributors for commercial purposes. However, there are other distributors who are not in this for commercial purposes. That is not nearly as pretty. I believe we need to think about that aspect. Perhaps legal consumers of pornography would also like to see their rights protected.

I will end with just one sentence: A real statesperson protects the dignity of the weak. That is our role.

Stopping Internet Sexual Exploitation Act / Private Members' Business

May 7th, 2024 / 6:20 p.m.

Etobicoke—Lakeshore Ontario

Liberal

James Maloney Liberal Parliamentary Secretary to the Minister of Justice and Attorney General of Canada

Mr. Speaker, I am very pleased to speak to Bill C-270, an act to amend the Criminal Code (pornographic material), at second reading.

I would like to begin my remarks by stressing the bill's important objective. It is to ensure that those who make, distribute or advertise pornographic material verify that those depicted in that material are at least 18 years of age and have consented to its production and distribution.

As the sponsor has explained, the bill's objective is to implement recommendation number two of the 2021 report of the House of Commons Standing Committee on Access to Information, Privacy and Ethics. Specifically, that report recommends that the government “mandate that content-hosting platforms operating in Canada require affirmation from all persons depicted in pornographic content, before it can be uploaded, that they are 18 years old or older and that they consent to its distribution”.

This recommendation responds to ongoing concerns that corporations like Pornhub have made available pornographic images of persons who did not consent or were underage. I want to recognize and acknowledge that this conduct has caused those depicted in that material extreme suffering. I agree that we must do everything we can to protect those who have been subjected to this trauma and to prevent it from occurring in the first place. I fully support the objective of the committee's recommendation.

I want to say at the outset that the government will be supporting Bill C-270 at second reading, but with serious reservations. I have concerns about the bill's ability to achieve the objective of the committee's recommendation. I look forward to committee study, where we can hear from experts on whether this bill would be useful in combatting child pornography.

The bill proposes Criminal Code offences that would prohibit making, distributing or advertising pornographic material, without first verifying the age and consent of those depicted by examining legal documentation and securing formal written consent. These offences would not just apply to corporations. They would also apply to individuals who make or distribute pornographic material of themselves and others to generate income, a practice that is legal and that we know has increased in recent years due to financial hardship, including that caused by the pandemic.

Individuals who informally make or distribute pornographic material of themselves and of people they know are unlikely to verify age by examining legal documentation, especially if they already know the age of those participating in the creation of the material. They are also unlikely to secure formal written consent. It concerns me that such people would be criminalized by the bill's proposed offences, where they knew that everyone implicated was consenting and of age, merely because they did not comply with the bill's proposed regulatory regime governing how age and consent must be verified.

Who is most likely to engage in this conduct? The marginalized people who have been most impacted by the pandemic, in particular sex workers, who are disproportionately women and members of the 2SLGBTQI+ communities. Notably, the privacy and ethics committee clearly stated that its goal was “in no way to challenge the legality of pornography involving consenting adults or to negatively impact sex workers.” However, I fear that the bill's proposed reforms could very well have this effect.

I am also concerned that this approach is not consistent with the basic principles of criminal law. Such principles require criminal offences to have a fault or a mental element, for example, that the accused knew or was reckless as to whether those depicted in the pornographic material did not consent or were not of age. This concern is exacerbated by the fact that the bill would place the burden on the accused to establish that they took the necessary steps to verify age and consent to avoid criminal liability. However, basic principles of criminal law specify that persons accused of criminal offences need only raise a reasonable doubt as to whether they committed the offence to avoid criminal liability.

I would also note that the committee did not specifically contemplate a criminal law response to its concerns. In fact, a regulatory response that applies to corporations that make, distribute or advertise pornographic material may be better positioned to achieve the objectives of the bill. For example, our government's bill, Bill C-63, which would enact the online harms act, would achieve many of Bill C-270's objectives. In particular, the online harms act would target seven types of harmful content, including content that sexually victimizes a child or revictimizes a survivor, and intimate content communicated without consent.

Social media services would be subjected to three duties: to act responsibly, to protect children and to make content inaccessible that sexually victimizes a child or revictimizes a survivor, as well as intimate images posted without consent.

These duties would apply to social media services, including livestreaming and user-uploaded adult content services. They would require social media services to actively reduce the risk of exposure to harmful content on their services; provide clear and accessible ways to flag harmful content and block users; put in place special protections for children; take action to address child sexual exploitation and the non-consensual posting of intimate content, including deepfake sexual images; and publish transparency reports.

Bill C-63 would also create a new Digital Safety Commission of Canada to administer this regulatory framework, and it would improve the investigation of child pornography cases through amendments to the Mandatory Reporting Act. That act requires Internet service providers to report to police when they have reasonable grounds to believe their service is being used to commit a child pornography offence. Failure to comply with this obligation can result in severe penalties.

As I know we are all aware, the Criminal Code also covers a range of offences that address aspects of the concerns animating the proposed bill. Of course, making and distributing child pornography are both already offences under the Criminal Code. As well, making pornography without the depicted person's knowledge can constitute voyeurism, and filming or distributing a recording of a sexual assault constitutes obscenity. Also, distributing intimate images without the consent of the person depicted in those images constitutes non-consensual distribution of intimate images, and the Criminal Code authorizes courts to order the takedown or removal of non-consensual intimate images and child pornography.

All these offences apply to both individuals and organizations, including corporations, as set out in section 2 of the Criminal Code. Should parliamentarians choose to pursue a criminal response to the concerns the proposed bill seeks to address, we may want to reflect upon whether the bill's objectives should be construed differently and its provisions amended accordingly.

I look forward to further studying such an important bill at committee.

Stopping Internet Sexual Exploitation Act (Private Members' Business)

May 7th, 2024 / 5:50 p.m.

Andréanne Larouche (Bloc, Shefford, QC)

Mr. Speaker, as the member for Shefford and the Bloc Québécois critic for the status of women, I want to say that we support Bill C-270 in principle. We would like to examine this bill in committee. The Bloc Québécois fully supports the bill's stated objective, which is to combat child pornography and the distribution and commercialization of non-consensual pornography.

Since the first warning about the tragedy of women and girls whose sexual exploitation is the source of profits for major online porn companies, the Bloc Québécois has been involved at every stage and at all times in the public process to expose the extent of this public problem, which goes to our core values, including the right to dignity, safety and equality.

On this subject of online sexual exploitation, as on all facets and forms of the sexual exploitation of women, we want to stand as allies not only of the victims, but also of all the women who are taking action to combat violence and exploitation. I will begin by giving a little background on the topic, then I will explain the bill and, in closing, I will expand on some of the other problems that exist in Canada.

First, let us not forget that the public was alerted to the presence of child pornography and other non-consensual content by an article published in the New York Times on December 4, 2020. The article reported the poignant story of 14-year-old Serena K. Fleites, explicit videos of whom were posted on the website Pornhub without her consent.

This Parliament has already heard the devastating, distressing and appalling testimony of young Serena. It helped us understand the sensitive nature and gravity of the issue, as well as the perverse mechanisms that porn streaming platforms use to get rich: they exploit the flaws of a technological system that, far from successfully controlling the content that is broadcast, is built and designed to promote, and yet conceal, the criminal practices of sexual exploitation.

Reports regarding the presence of child sexual abuse material and other non-consensual content on the adult platform Pornhub led the Standing Committee on Access to Information, Privacy and Ethics to undertake a study on the protection of privacy and reputation on online platforms such as Pornhub. My colleague from Laurentides—Labelle has followed this issue closely.

The committee noted that these platforms' content moderation practices had failed to protect privacy and reputation and had failed to prevent child sexual abuse material from being uploaded, despite statements by representatives of MindGeek and Pornhub who testified before the committee.

That same committee looked at regulating adult sites and online pornography, without challenging their legality. The committee heard testimony from survivors, critics of MindGeek's practices, child protection organizations, members of law enforcement, the federal government, academics, experts and support organizations, and it received many briefs.

The Standing Committee on Access to Information, Privacy and Ethics made 14 recommendations regarding the problems it had studied. The committee's 2021 report was clear: it recommended that the government introduce a bill to create a new regulator to ensure that online platforms remove harmful content, including depictions of child sexual exploitation and non-consensual images.

We know that sexually explicit content is being uploaded to Pornhub without the consent of the individuals involved, including minors, and that these individuals have tried and failed to get Pornhub to remove that content. We know that these survivors have been traumatized and harassed and that most of them have thought about suicide. That is the type of testimony that we heard at the Standing Committee on the Status of Women with regard to cases of sexual exploitation.

We know that even if content is finally removed, users just re-upload it shortly afterward. We know that the corporate structure of MindGeek, which was renamed Aylo last August, is the quintessential model for avoiding accountability, transparency and liability. We know that investigations are under way and that there has been a surge in online child sexual exploitation reports.

We must now legislate to respond to these crimes and deal with these problems. We also need to keep in mind the magnitude of the criminal allegations and the misconduct of which these companies are accused. Just recently, a new class action lawsuit was filed in the United States against MindGeek and many of the sites it owns, including Pornhub, over allegations of sex trafficking involving tens of thousands of children.

Let us not forget that these companies are headquartered right in Montreal. The fact that our country is home to mafia-style companies that profit from sexual exploitation is nothing to be proud of. The international community is well aware of this, and it reflects poorly on us. For these reasons, we have an additional obligation to take action, to find solutions that will put an end to sexual exploitation, and to implement those solutions through legislation.

With that in mind, we must use the following questions to guide our thinking. Are legislative proposals on this subject putting forward the right solutions? Will they be effective at controlling online sexual exploitation and, specifically, preventing the distribution of non-consensual content and pornographic content involving minors?

Second, let us talk a little more about Bill C‑270. This bill forces producers of pornographic material to obtain the consent of individuals and to ensure that they are of age. In addition, distributors will have to obtain written confirmation from producers that the individuals' consent has been obtained and that they are of age before the material is distributed. These new Criminal Code provisions will require large platforms and producers to have a process for verifying individuals' age and consent, without which they will be subject to fines or imprisonment.

The House will be considering two bills simultaneously. The first is Bill C-270, from the member for Peace River—Westlock, with whom I co-chair the All-Party Parliamentary Group to End Modern Slavery and Human Trafficking. The second is Bill C-63, introduced by the Minister of Justice, which also enacts new online harms legislation and aims to combat the sexual victimization of children and to make intimate content communicated without consent inaccessible.

We will need to achieve our goals, which are to combat all forms of online sexual exploitation and violence, stop the distribution and marketing of all pornographic material involving minors, prevent and prohibit the distribution of explicit non-consensual content, force adult content companies and platforms to control the distribution of such content, and make them accountable and criminally responsible for the presence of such content on their online platforms.

There is a debate about the law's ability to make platforms accountable for the content they host. The bill also raises questions about the relevance of self-regulation in the pornography industry.

Third, let us talk about what we can do here. Due to the high volume of complaints it receives, the RCMP often reacts to matters relating to child sexual abuse material, or CSAM, rather than acting proactively to prevent them. Canada's criminal legislation prohibits child pornography, but also other behaviours aimed at facilitating the commission of a sexual offence against a minor. It prohibits voyeurism and the non-consensual distribution of intimate images. Other offences of general application such as criminal harassment and human trafficking may also apply depending on the circumstances.

In closing, I will provide a few figures to illustrate the scope of this problem. Between 2014 and 2022, there were 15,630 incidents of police-reported online sexual offences against children and 45,816 incidents of online child pornography. The overall rate of police-reported online child sexual exploitation incidents has also risen since 2014. The rate of online child pornography increased 290% between 2014 and 2022. Girls were overrepresented as victims for all offence types over that nine-year period. The majority of victims of police-reported online sexual offences against children were girls, particularly girls between the ages of 12 and 17, who accounted for 71% of victims.

Incidents of non-consensual distribution of intimate images most often involved a youth victim and a youth accused. Nearly all child and youth victims, 97% to be exact, between 2015 and 2022 were aged 12 to 17 years, with a median age of 15 years for girls and 14 years for boys. Overall, nine in 10 accused persons, or 90%, were youth aged 12 to 17. For one-third of youth victims, or 33%, a casual acquaintance had shared the victim's intimate images with others.

Here is a quote from the Montreal Council of Women: “On behalf of the members of the Montreal Council of Women, I wish to confirm our profound concern for those whose lives have been turned upside down by the involuntary and/or non-consensual sharing of their images on websites and other platforms such as the Montreal-based Pornhub. The ‘stopping Internet sexual exploitation act’ will make much-needed amendments to the Criminal Code to protect children and those who have not given consent for their images and other content to be shared and commercialized.”

We must act. It is a question of safety for our women and girls. Young women and girls are depending on it.

May 2nd, 2024 / 1:05 p.m.

Strategic Advisor and Foresight Expert, As an Individual

Sanjay Khanna

I would agree with Mr. Finkelstein on the question of having an investigatory capacity. I think that's important.

More broadly, I think it's really about looking at the offices of Parliament and thinking about how to protect your ability as parliamentarians to ensure what you're working with as your ground truth is based on fact: how to do that, how to train your staff and how to build their capacity to be resilient so that everyone who then interacts with you, whether it's your constituents or others, knows that you're at least a trusted source. I'm not talking about your policy positions. I'm talking about the ground truth that you're using to make decisions. I think parliamentary staff are going to be more targeted by these technologies, such as deepfake videos, manipulated voices and those sorts of things.

The other piece, then, is how you are going to protect the body of Canadian society that is your constituents and how you are going to protect the next generations. This is why I was suggesting a Canadian charter of digital rights and freedoms that outlines both the responsibilities and protections of Canadian citizens. I know that there's been a lot on the online harms act, but I don't think it clarifies to citizens what their responsibilities are and what protections may be available to them.

I'll stop there because I know we have limited time, but thank you very much for that question.

April 30th, 2024 / 12:20 p.m.

Associate Professor of Journalism, Media School, UQAM, As an Individual

Patrick White

Canada is already working hard, with what it did with Bill C-18 and Bill C-11 for Canadian content, and with Bill C-63 it's going to fight misinformation and harmful content as well. Are we doing enough? Probably not, but AI is an opportunity as well as a threat.

As far as deepfakes are concerned, I would strongly urge the government to legislate on that matter within the next 12 to 18 months, especially on deepfake videos and deepfake audio, as well, which you mentioned.

We have a lot to work on in the next 12 months on that issue, taking into context the upcoming federal election in Canada.

April 30th, 2024 / 12:05 p.m.

Patrick White, Associate Professor of Journalism, Media School, UQAM, As an Individual

Good afternoon, everyone.

I'd like to thank the committee members for the invitation.

I've been a journalist since 1990 and a professor of journalism at Université du Québec à Montréal for five years.

I believe that 2024 represents a crossroads for disinformation and misinformation. Content automation has proliferated since the launch of the ChatGPT chatbot, built on GPT-3.5, in 2022. Not only that, but a Massachusetts Institute of Technology study published in 2018 showed that false news circulates six times faster on Twitter than fact-checked news. That's cause for concern.

Things have gotten worse on X, formerly called Twitter, over the past 18 months, since it was taken over by businessman Elon Musk. Several changes have contributed to this, including the possibility of acquiring a blue checkmark, meaning verified status, simply by paying a few dollars a month, along with the reinstatement of accounts like that of former U.S. President Trump, who is himself a major vector of disinformation.

These social network algorithms clearly promote content that generates the most traffic, meaning comments, “likes” and sharing, which amplifies the spread of extreme ideas that we've been seeing in recent years.

One current concern is Meta's blocking of news on Facebook and Instagram in Canada since the summer of 2023, which further fuels the growth of disinformation and misinformation by suppressing news from Canadian media, except for sports and cultural news.

A recently published study that was quoted by Reuters says:

comments and shares of what it categorised as “unreliable” sources climbed to 6.9% in Canada in the 90 days after the ban, compared to 2.2% in the 90 days before.

On the political side of things, I believe efforts should be made to get the news back on Facebook and Instagram by the end of 2024, before Canada's federal elections. The repercussions of this disinformation are political. For example, on Instagram, you now have to click on a tab to see political publications. They've been purposely blocked or restricted by Meta for several months now. The experience is unpleasant for Canadians on Facebook, because more and more content of interest to them from major Canadian media outlets is being replaced by junk news. This reduces the scope of what people are seeing, is harmful to democracy, and also leads to less traffic on news sites. According to a recently published study from McGill University, to which our colleague who testified earlier contributed, news is being replaced by memes on Facebook. It reports the disappearance of five million to eight million views per day of informational content in Canada.

The Canadian government will also have to take rapid action on the issue of artificial intelligence by prohibiting the dissemination of AI-generated content like deepfake images and audio. Bill C-63 is a partial response to harmful content, but it doesn't go far enough. More transparency is needed with respect to AI-generated content.

Oversight is also urgently needed for intellectual property. The Montreal newspaper Le Devoir ran an article about that this morning. What are the boundaries? I encourage you to quickly develop legislation to address this issue, rather than wait 30 years, as was the case for Bill C-11.

Canadian parliamentarians also need to declare war on content farms that produce false news on request about our country and other countries. Foreign governments like China's and Russia's often use that strategy. We mustn't forget that 140 million people were exposed to false news in the United States during the 2020 election. That's clearly very troubling in view of the coming U.S. election this fall. I am also amazed that Canada has been allowing the Chinese Communist Party to continue spreading propaganda press releases on the Canadian Cision newswire for years.

To conclude, I'll be happy to answer your questions. Canada needs to be on a war footing against disinformation, whether generated by artificial intelligence or manually. Stricter rules are required for generative artificial intelligence and for the protection of intellectual property owned by Canadian media and artists, who should be benefiting from these technological advances over the coming years.

Thank you.

April 29th, 2024 / 11:20 a.m.

Senior Assistant Deputy Minister, Strategy and Innovation Policy Sector, Department of Industry

Mark Schaan

At this time, no federal legislation defines the age of minority or majority. The only age defined is the voting age, which is set at 18. However, that has nothing to do with the concept of majority.

Bill C‑63 on harmful content online is currently proposing that the age of majority be set at 18 in the digital world.

That said, right now only the provinces and territories, based on vital statistics, determine the age of majority and minority in Canada.

Public Safety (Adjournment Proceedings)

April 18th, 2024 / 6:30 p.m.

Ryan Turnbull (Liberal, Whitby, Ontario), Parliamentary Secretary to the Minister of Innovation

Madam Speaker, the member could certainly consider supporting the government's online harms bill, a major piece of legislation that will help to protect minors and children when they are interacting online.

I appreciate this opportunity to speak about the ongoing threat of extortion in Canada. The Government of Canada is deeply concerned about Canadians who are victimized by acts of extortion and related violence. The Government of Canada is aware of growing concerns related to extortion across the country and, indeed, the government has heard directly from the mayors of Surrey, British Columbia; Edmonton, Alberta; and Brampton, Ontario, about how this is impacting their communities.

The recent increase in the number and severity of extortion attempts, particularly those targeting members of Canada's South Asian community, is alarming. The Government of Canada and the RCMP encourage anyone experiencing or witnessing extortion to report it to their local police of jurisdiction and discourage anyone from complying with demands for money.

Rest assured, the Government of Canada is committed to protecting the safety of Canadians and Canadian interests against these threats. We are taking concrete action to protect all affected communities across Canada.

As Canada's national police force, the Royal Canadian Mounted Police is mandated to prevent, detect and investigate serious organized crime, in order to protect Canadians and Canadian interests. In doing so, the RCMP works closely with domestic and international law enforcement partners to share information and target shared threats. The RCMP and its law enforcement partners across the country have observed an increase in the number of extortion crimes taking place and are working collaboratively to investigate these incidents.

While the RCMP cannot comment on specific investigations, I can confirm that significant coordination is under way across the country to address similar types of extortion attempts directed at the South Asian communities in British Columbia, Alberta and Ontario. While many investigations remain ongoing, a number of arrests have been made, and information sharing across agencies, I would say, is imperative, as coordinated efforts are under way to identify cases that may be related to one another.

To this end, the RCMP is actively sharing information with local law enforcement to support their ongoing efforts.

Rest assured, law enforcement agencies across the country are utilizing the required tools and resources to combat these serious incidents in order to keep Canadians safe.

Financial Statement of the Minister of Finance (The Budget, Government Orders)

April 18th, 2024 / 3:30 p.m.

Arif Virani (Liberal, Parkdale—High Park, Ontario), Minister of Justice and Attorney General of Canada

Mr. Speaker, I rise today to address budget 2024. I propose to deliver my remarks in two contexts: first, to address how this budget resonates with the residents whom I am privileged to represent in Parkdale—High Park in Toronto; second, to look more broadly at some of the very important components of this budget document that relate to the administration of justice in this country.

I am proud to have represented, for almost nine years now, the constituents in Parkdale—High Park. What those constituents have talked to me repeatedly about is the need to address housing. In budget 2024, we find some very key provisions that relate to housing. I cannot list them all, but some deal with the pressing issue of building more housing, increasing housing supply. That is fundamental in terms of what we are trying to do as a government, and it is empowered and advanced by this important budget document. What I am speaking of here is, for example, $15 billion in additional contributions to Canada's apartment construction loan program, which will help to build more than 30,000 additional new homes.

What I also take a lot of pride in is the fact that we are addressing the acute needs of renters. I say that in two respects. This budget document outlines, for example, how renters can be empowered on the path to home ownership by having a proper rental payment history. Regular rent payments made over a period of years can build up one's creditworthiness with credit rating agencies, so that when the time comes to apply for a mortgage, that creditworthiness is already established. This is truly empowering for the renters in my community and in communities right around the country. I have already heard that feedback from the renters whom I represent.

Lastly, I would simply point out what we are doing with respect to the tenants' bill of rights. This is a really important document that would ensure that tenants have rights they can vindicate, including before tribunals and, potentially, courts of law. We are coupling that with a $15-million investment to support the advocates who assist those renters. That is fundamental. In that respect, it relates to the two hats that I wear in this chamber, as a representative of individual renters and as Minister of Justice.

Another component that my constituents have been speaking to me about regularly since 2015 is our commitment to advancing meaningful reconciliation with indigenous peoples. Budget 2024 has a number of components that relate to indigenous peoples; there are two that I would highlight for the purpose of these remarks. First, there is what we are doing to settle litigation brought by indigenous peoples and to proceed on a better, more conciliatory path forward. I am speaking of the $23-billion settlement with indigenous groups who litigated the discriminatory underfunding of child and family services, a historic settlement that was ratified by the Federal Court. That is critical.

Second, in this document we also talk about funding a project that is near and dear to my heart. Why do I say that? It is because, in 2017, I had the privilege of serving as the parliamentary secretary to the Minister of Heritage. At that time, I helped to co-develop, along with Métis, first nations and Inuit leaders, the legislation that has now become the Indigenous Languages Act. That is coupled with an indigenous languages commission. In this very budget document, we talk about $225 million to ensure the continued success of that commission and the important work it is doing to promote, enhance and revitalize indigenous languages in this country.

Those are fundamental investments. I think it is really important to highlight them in the context of this discussion.

I would also highlight that my riding, I am proud to say, is full of a lot of people who care about women. They care about feminism; they care about social and economic policies that empower women. I would highlight just two. First of all, we talk about pharmacare in this budget. The first volley of pharmaceutical products that will be covered includes contraceptive devices that would assist, as I understand it, as many as nine million Canadians through access to contraception. This would allow women, particularly young women and older women, to ensure that they have control over their reproductive function. That is fundamental to me as a representative, and it is fundamental to our government and what our government prioritizes in this country. I would also say that, with $10-a-day child care, there are affordable and robust means of ensuring that people's children are looked after in this country; that empowers women to do such things as participate in the workforce.

What I am speaking about here is that we are hitting levels of women's participation in the workforce that have never been seen before, with women's labour force participation of 85.4%. That is an incredible social policy that is translating into a terrific economic policy.

We can also talk about the $6.1-billion Canada disability benefit. I am proud to say that the constituents of Parkdale—High Park care meaningfully about inclusive policies, policies that alleviate poverty and are addressed to those who are vulnerable and those who are in need. People have been asking me about the disability benefit, including when we will see it and when it will come to the fore. We are seeing it right now with this document. The very document that we will be voting on in this chamber includes a $6.1-billion funding model to empower Canadians who are disabled and to ensure that we are addressing their needs.

This budget also represents a bit of a catch-up, meaning that we are catching up to the rest of the G7. Until this budget was delivered, we remained the only G7 country not to have a national school food program. It goes without saying that not a single one of the 338 members privileged to serve in this House would think it is good for a child to arrive at school hungry, in any of their communities or in this country as a whole. I do not think this is a partisan statement whatsoever. Through a national school food program, we would directly address child hunger and ensure that children do not arrive at school hungry, which impedes their productivity and certainly limits their education. Through a $1-billion investment, we would tackle hunger and poverty in our schools.

We are also introducing legislation to reduce cellphone and banking fees, which is fundamental.

With respect to the hat I wear as Minister of Justice, which I have done for about eight months, I firmly believe that one of my pivotal roles is ensuring access to justice. I would say that this document really rings true to the commitment that I have personally and that our government and the Prime Minister have to this. Here, I am speaking about the notion of our commitment to legal aid. Legal aid has multiple components, but it is fundamental to ensuring that people can have their rights vindicated with the assistance of counsel. This helps address things such as court backlogs and court delays; it is also fundamental for the individual litigants before the courts. There is a criminal legal aid package in this budget that includes $440 million over five years.

There is also immigration and refugee legal aid. Unfortunately, the provinces have wholesale resiled from their involvement in this portfolio, so since 2019 we have been stepping in with annual funding. We are making that funding no longer simply annual; we are projecting it over a five-year term, to the tune of $273 million, which gives certainty and predictability to the people who rely on immigration and refugee legal aid. That is fundamental.

Members heard in question period about efforts we are making to address workplace sexual harassment. I will pivot again here to the fact that this dovetails with both my ministerial role and my role of devoted constituency representative as the MP for Parkdale—High Park. I hear a great deal from my constituents about speaking to women's needs in terms of addressing harassment and sexual harassment. With this budget, we would provide $30 million over three years to address workplace sexual harassment. That is also fundamental.

Likewise, what we are doing on hatred is fundamental. Three full pages of the budget document are dedicated to addressing hatred. Some points dovetail with legislation that I have tabled in this House, including Bill C-63, regarding what we would do to curb online hatred and its propensity to spread. However, there are also concrete investments here for Canada's action plan on combatting hate and for empowering such bodies as the Canadian Race Relations Foundation in the important work it is doing to promote better understanding and to build the knowledge base of hate crimes units. Also, fundamentally, there is money dedicated in this very budget to ensuring that both law enforcement agencies and Crown prosecutors are better trained and better informed about how to identify hate and potentially prosecute it. With where we are as a country right now, this is a pressing need; I am very proud to see budget 2024 addressing it directly.

For the reasons I outlined earlier, in terms of how this addresses the particular needs of my constituents, and for the many justice investments it makes in ensuring access to justice and tackling pernicious issues such as sexual harassment and hatred, I believe this is a budget that all 338 of us should get behind and support.

Alleged Premature Disclosure of Bill C-63—Speaker's Ruling (Privilege, Oral Questions)

April 11th, 2024 / 3:10 p.m.

Liberal

The Speaker Liberal Greg Fergus

I am now ready to rule on the question of privilege raised on February 26, 2024, by the House leader of the official opposition, concerning the alleged premature disclosure of Bill C-63, an act to enact the online harms act, to amend the Criminal Code, the Canadian Human Rights Act and an act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service and to make consequential and related amendments to other acts.

The opposition House leader claimed that the bill's contents had been leaked to the media, as evidenced in two separate reports from CBC and CTV News. Pointing to the anonymous quotes in the news reports, he concluded his remarks by positing that the information was leaked intentionally, in the knowledge that doing so was wrong and a breach of the rights of members of Parliament and of the House.

For his part, the parliamentary secretary to the government House leader countered that the envisioned legislation's objectives were widely known and already in the public domain long before the bill was placed on notice and introduced, given the government's prior commitments and extensive public consultations. Furthermore, the parliamentary secretary emphatically rejected the allegations that the government had shared the bill before it was introduced.

The House leader of the official opposition is correct in asserting that there are abundant precedents establishing that once a bill is placed on notice, its contents are not to be disclosed prior to introduction, thus ensuring that members have the first opportunity to take note of the bill. The premature disclosure of bills has usually been treated as a contempt of the House.

I will invite MPs to please take their conversations outside of the House, including the member for Scarborough—Guildwood.

In a ruling on October 4, 2010, which can be found at page 4711 of the Debates, Speaker Milliken stated, and I quote:

It is indisputable that it is a well-established practice and accepted convention that this House has the right of first access to the text of bills it will consider.

On the substantive matter raised in this question of privilege, as members know, the policy direction leading to a government bill is not typically developed in the strict isolation of a government department. Prior to the putting on notice and introduction of most modern legislation, extensive consultations and public debate frequently occur for months or even years. Past precedents from the Chair address this reality, and Bill C-63 seems to be another example of that pattern.

On June 8, 2017, Speaker Regan emphasized the need for balance between members' right to have the first opportunity to see the bill and the need for prior public consultation. He said, at page 12320 of the Debates:

The right of the House to first access to legislation is one of our oldest conventions. It does and must, however, coexist with the need of governments to consult widely, with the public and stakeholders alike, on issues and policies in the preparation of legislation.

In the same ruling, Speaker Regan indicated that the denial of a premature disclosure of the bill by the government, and the absence of evidence that members were impeded in the performance of their parliamentary duties, had led him to find that the matter was not a prima facie case of privilege.

Having reviewed the contents of the bill against what was reported in the media, and considering the assurance given by the parliamentary secretary that the government did not share the text of the bill between its placement on notice and its introduction, it cannot be determined that the information that appeared in the news media necessarily came from a premature disclosure of the bill by so-called senior government sources.

The title of the bill, combined with the various sources of information mentioned above, such as background information provided during the consultation process, could have easily informed observers as to the specific objectives of the bill. There is a plausible argument to be made that the scope, objectives and targets of the bill were known prior to its being placed on notice and introduced.

Not being able to say with certainty that the information in the media reports came from the bill itself, I cannot determine that any member was impeded in the carrying out of their parliamentary duties, or that the dignity of the House was transgressed. As such, the Chair cannot find that there is a prima facie question of privilege.

That being said, the Chair shares the members' concerns when detailed information on proposed legislation, whether accurate or not, appears in media stories prior to their introduction.

It casts doubt on the role and predominance of Parliament in the legislative process and may lead to—

Order. I am going to remind all members that one of the fundamental rules of being a member and being a Speaker in this House is that members are not to question or to insult the Speaker, unless they are doing it through a motion which would call into question the Speaker's role. I would like to remind all members about this fundamental rule. I know that I have had some conversations with members in the past about this.

I will continue.

It casts doubt on the role and predominance of Parliament in the legislative process and may lead to understandable frustration.

I thank all members for their attention.

National Defence (Committees of the House, Routine Proceedings)

April 10th, 2024 / 5:15 p.m.

Liberal

Kevin Lamoureux Liberal Winnipeg North, MB

Madam Speaker, the member is so sensitive to us calling out what the Conservative Party is doing. I just finished saying that the most important reality of our Canadian Forces is the families, and he is standing up on a point of order. Does he not realize that the families of the Canadian Forces members are, in fact, what this report is all about?

As someone who was in the Canadian Forces and who was posted in Edmonton, I understand the issue of housing. I understand the pros and cons, the dips and so forth that take place, the waiting list for PMQs, for barracks and the whole process by which housing has evolved in the Canadian Forces, and I understand how important the issue is. I did not learn this only today, and it did not take the report coming to the floor to be debated. This is not new. There have always been waiting lists to get into PMQs, since the days when I was in the forces. I had to wait, and I actually lived in a PMQ. There have always been waiting lists.

Why did the Conservative Party wait until today to introduce this motion? If, in fact, Conservatives were genuine and really cared about the families and the Canadian Forces, they could have introduced some form of a motion on an opposition day. They should have done that if they genuinely cared about families and those in the forces representing our country and doing a phenomenal job, whether in Canada or abroad.

The Government of Canada has the backs of those members in the Canadian Forces and their families a lot more than Stephen Harper ever did. When I was first elected to the House of Commons in 2010, Stephen Harper literally closed down veterans offices, not two or three, but nine all over the country.

Members can imagine the veterans who already served in the forces in many different capacities and were going into private homes and facilities, some even in the non-profit area, when Stephen Harper shut down those access offices. In Manitoba, it was in Brandon. I was glad that when we took over the reins of power, we actually reopened those offices to continue to support our veterans.

There are two issues here that really need to be talked about. First and foremost is the motivating factor of the Conservative Party today and why the Conservatives are moving this motion. As the NDP House leader clearly attempted to get this motion passed, the Conservatives said no. It was not because of interest for members of the forces but rather to prevent legislation from being debated.

Just yesterday, I was in the House and had the opportunity to speak to a private member's bill, Bill C-270, which dealt with the issues of child porn and non-consensual porn. I stood in my place and provided commentary on how serious and important that issue is, not only to the government but also to every member inside this chamber. Throughout the debate, we found out that the Conservative Party was actually going to be voting against Bill C-63, which is the online harms act.

That was important to mention because the Conservatives were criticizing the government for not calling the legislation. They were heckling from their seats and were asking why we did not call the legislation if it was so important.

The Conservatives realize that when they bring in motions, as they have done today, they are preventing the government from bringing in legislation and from having debates on legislation. Then, they cry to anyone who will listen. They will tell lies and will do all sorts of things on social media. They spread misinformation to Canadians to try to give the impression that the House and Canada are broken.

There is no entity in the country that causes more dysfunction in the House of Commons, or even outside of the Ottawa bubble, than the Conservative Party of Canada under the leadership of the far right MAGA leader today. That is the core of the problem. They have a leader who genuinely believes and who wants to demonstrate that this chamber is dysfunctional. The only thing that is dysfunctional in this chamber is the Conservative Party. It does not understand what Canadians want to see.

If we look at some of the commitments we are making to the Canadian Armed Forces, we are talking about billions of dollars in the coming years. We have a target, and a lot depends on economic factors, but we are looking at 1.7% by 2030.

Let us contrast that to the Conservative government of Stephen Harper, who was prime minister when the current Conservative leader was a parliamentary secretary and part of that government in a couple of roles. We saw a substantial decrease in funding. I made reference to the veterans offices and to shutting them down. What about the lack of general funding for the Canadian Forces? We hit an all-time low under the Conservative Party and Stephen Harper: 1% of GDP. That would be awfully embarrassing when going abroad and talking to people in the United States or to any of our NATO allies. They were laughing at the Harper regime.

The Liberal government had to straighten out the problems of the Conservatives' inability to get a jet fighter. For years, they tried and failed. The Liberal government is now delivering on getting the jet fighters. The Liberal government continues to look at ways we can enhance our Canadian Forces, not only for today but also into the future. We will have new search and rescue aircraft that will be operating out of places like the city of Winnipeg.

Stopping Internet Sexual Exploitation Act (Private Members' Business)

April 9th, 2024 / 6:15 p.m.

Conservative

Michelle Rempel Conservative Calgary Nose Hill, AB

Madam Speaker, I have a lot to say about the bill. I will just start with a brief personal anecdote. I want to be very clear when I say this: I do not offer this as victim porn or to look for sympathy. It is an example: if somebody like me, in a position of privilege, has a hard time accessing the justice system, what about others?

When I was a minister of the Crown, over 10 years ago, I received very explicit sexualized online threats, very graphic descriptions of how somebody was going to rape me, with what instruments, and how they were going to kill me. I was alone in a hotel room. My schedule had been published the day before, and I was terrified. The response at that time from law enforcement, and the process I had to go through as a minister of the Crown, to attempt to get justice in a situation that did not involve intimate images, sticks with me to this day. If I had to go through that at that time, what hope is there for somebody who does not have my position of privilege?

What the bill would do is recognize that the forms of discrimination and harassment that, as my colleague from Esquimalt—Saanich—Sooke says, disproportionately impact women, sexual minorities and other persons, have outpaced Parliament's ability to change the law. Here we are today.

Briefly, I want to respond to some of the points of debate. First of all, my colleague from the Liberals suggested that we expedite Bill C-63. That bill has been so widely panned by such a variety of disparate stakeholders that the government has not even scheduled it for debate in the House yet.

Second, and this is particularly for my colleagues who are looking to support this, to send the bill through to second reading, Bill C-63 would not provide criminal provisions either for any of the activities that are in the bill or for some of the other instances that have been brought up in the House for debate tonight, particularly the non-consensual distribution of deepnudes and deepfake pornography.

I raised the issue in the House over seven months ago. The intimate image distribution laws that are currently in the Criminal Code were only put in place in 2014, about a decade after social media came into play, and after Rehtaeh Parsons and Amanda Todd tragically died due to an absence in the law. Seven months have passed, and the government could have dealt with updating the Criminal Code with a very narrow provision that the Canadian Bar Association and multiple victims' rights groups have asked for, yet it has chosen not to.

There are so many articles that have been written about what is wrong with what is in Bill C-63 that we now need to start paying attention to what is wrong with it because of what is not in there. There is no update to Canada's Criminal Code provisions on the distribution of intimate images produced by artificial intelligence that are known as deepnudes.

I want to be very clear about this. There are websites right now where anyone in this place can download an app to their phone, upload any image of any person, including any person in here, and imagine what that looks like during an election campaign, erase people's clothes, and make it look like legitimate pornography. Imagine, then, that being distributed on social media without consent. The Canadian Bar Association and law professors say, and I could read case after case, that our laws do not cover this.

At the beginning of February, there was a Canadian Press article that said that the government would update the law in Bill C-63, but it did not. Instead, what it chose to do was put in place a three-headed bureaucracy, an entirely extrajudicial process that amounts to a victim of these crimes being told to go to a bureaucratic complaints department instead of being able to get restitution under the law. Do we know what that says to a perpetrator? It says, “Go ahead; do it. There is no justice for you.” It boggles my mind that the government has spent all of this time while countless women and vulnerable Canadians are being harassed right now.

I also want to highlight something my colleague from Esquimalt—Saanich—Sooke said, which is that there is a lack of resources for law enforcement across the country. While everybody had a nice couple of years talking about defunding the police, how many thousands of women across this country, tens of thousands or maybe even millions, experienced online harassment and were told, when they finally got the courage to go to the police, that it was in their head?

One of those women was killed in Calgary recently. Another of those women is Mercedes Stephenson, who talked about her story about trying to get justice for online harassment. If women like Mercedes Stephenson and I have a hard time getting justice, how is a teenager in Winnipeg in a high school supposed to get any sort of justice without clarity in the Criminal Code if there are deepnudes spread about her?

I will tell members how it goes, because it happened in a high school in Winnipeg after I raised this in the House of Commons. I said it was going to happen and it happened. Kids were posting artificial intelligence-generated deepnudes and deepfakes. They were harassing peers, harassing young women. Do members know what happened? No charges were laid. Why were no charges laid? According to the article, it was because of ambiguity in the Criminal Code around artificial intelligence-created deepnudes. Imagine that. Seven months have passed. It is not in Bill C-63.

At least the bill before us is looking at both sides of the coin on the Criminal Code provisions that we need to start looking at. I want to ensure that the government is immediately updating the Criminal Code to say that if it is illegal to distribute intimate images of a person that have been taken with a camera, it should be the exact same thing if it has been generated by a deepnude artificial intelligence. This should have been done a long time ago.

Before Bill C-63 came out, Peter Menzies, the former head of the CRTC, talked about the need to have non-partisan consensus and narrowly scoped bills so it could pass the House, but what the government has chosen to do with Bill C-63 is put in place a broad regulatory system with even more nebulousness on Criminal Code provisions. A lot of people have raised concerns about what the regulatory system would do and whether or not it would actually be able to address these things, and the government has not even allowed the House to debate that yet.

What we have in front of us, from my perspective, is a clear call to action to update the Criminal Code where we can, in narrow provisions, so law enforcement has the tools it needs to ensure that victims of these types of crimes can receive justice. What is happening is that technology is rapidly outpacing our ability to keep up with the law, and women are dying.

I am very pleased to hear the multipartisan nature of debate on these types of issues, and that there is at least a willingness to bring forward these types of initiatives to committee to have the discussions, but it does concern me that the government has eschewed any update of the Criminal Code in favour of a regulator. Essentially, what I am worried about is that it is telling victims to go to the complaints department, an extrajudicial process, as opposed to giving law enforcement the tools it needs.

I am sure there will be much more debate on this, but at the end of the day, seven months have passed since I asked the government to update the Criminal Code to ensure that deepnudes and deepfakes are in the Criminal Code under the non-consensual intimate image distribution laws. Certainly what we are talking about here is ensuring that law enforcement has every tool it needs to ensure that women and, as some of my colleagues have raised here, other sexual minorities are not victimized online through these types of technologies.

Stopping Internet Sexual Exploitation Act (Private Members' Business)

April 9th, 2024 / 6:10 p.m.

NDP

Randall Garrison NDP Esquimalt—Saanich—Sooke, BC

Madam Speaker, New Democrats support, as all parties do, tackling the important issues that the bill before us seeks to tackle. We also know that there has been an explosion of sexual exploitation of individuals online without their consent and an explosion of child pornography. What we have to do is find those measures that would be effective in bringing an end to these heinous practices.

Like the member for Peace River—Westlock, I would like to support and salute the survivors who have told their tales, at much personal sacrifice and much personal anguish, publicly acknowledging what has happened to them and the impact it has had on their lives. We would not be making progress on these issues without that work by those survivors, so I think we all want to salute them for their bravery in taking up this problem.

However, the challenge with these issues is to find what will actually work to end sexual exploitation. We know that a lack of resources for enforcement is almost always at least as important as any gaps in legislation. What we need to see is dedicated funding for specific, skilled police units to tackle these questions, because these cases can become highly complex and convoluted to bring to prosecution, and we know that is one of the problems with the existing legislation. It is difficult to prosecute these offences under the Criminal Code as it now stands.

We look forward, as New Democrats, to hearing from expert witnesses in committee on what measures will actually be the most effective in bringing an end to these practices, and whether and how the measures proposed in Bill C-270 would contribute to bringing an end to online sexual exploitation. The bill, in some senses, is very simple. It would require checking ID and keeping records of consent. Some would argue that the existing law already implicitly requires that, so is this a step that would make it easier to prosecute? I do not know the answer to that, but I am looking forward to hearing expert testimony on it.

While this legislation is not specific to women, it is important to acknowledge the disproportionate representation of women as victims of both child pornography and of sexual exploitation online without consent. However, I would also note that we have had a recent rash of cases of sexploitation or sextortion of young men who thought they had been speaking to other partners their own age online. They later find out that they were being threatened with the images they had shared being posted online and being asked for money or sexual favours to avoid that. Yes, it is primarily women, but we have seen this other phenomenon occurring where men pose as young women to get young boys to share those images.

Obviously, we need more education for young people on the dangers of sharing intimate images, although I am under no illusion that we can change the way young people relate to each other online and through their phones. Education would be important, but some measures to deal with these things when they happen are also important.

If we look at the Criminal Code, subsection 162.1(1) already makes it illegal to distribute an intimate image without consent. Of course, child pornography, under a succeeding subsection, is also already illegal. This was first added to the Criminal Code 11 years ago. I was a member of Parliament at that time, and the member for Peace River—Westlock joined us shortly after. It came in an omnibus bill brought forward by the Conservatives. In that bill, there were a number of things, to be honest, that New Democrats objected to, but when the bill, which was Bill C-13 at the time, was brought forward, our spokesperson Françoise Boivin offered to the government to split the bill, take out the section on online exploitation without consent and pass it through all stages in a single day. The Conservatives refused, at that point, to do that, and it took another year and a half to get that passed into law.

New Democrats have been supportive in taking these actions and have recognized its urgency for more than a decade. We are on board with getting the bill before us to committee and making sure that we find what is most effective in tackling these problems.

What are the problems? I see that there are principally two.

One, as I have mentioned before, is the difficulty of prosecution and the difficulty of making those who profit from this pay a price. All the prosecutors I have talked to have said that it is difficult to make these cases. It is difficult to investigate, and it is difficult to get convictions. Are there things we can do that would help make prosecution easier, and are the things suggested in the bill going to do that? I look forward to finding that out in committee.

The second problem is the problem of takedown, and we all know that once the images are uploaded, they are there forever. They are hard to get rid of. As members of the government's side have pointed out, there are measures in government Bill C-63 that would help with warrants of seizure, forfeiture, restitution and peace bonds in trying to get more effective action to take down the images once they have been posted. I am not an optimist about the ability to do that, but we seem to lack the tools we need now to make a stab at taking the images off-line. It is also important to remember that whatever we do here has to make our law more effective at getting those who are profiting from the images. That is really what the bill is aimed at, and I salute the member for Peace River—Westlock for that singular focus because I think that is really key.

We also have to be aware of unintended consequences. When subsection 162.1(1) became law, in court we ran into a problem fairly early on of minors who share private images between each other, because technically, under the law as it is written, that is illegal; it is child pornography, and it certainly was not the intention to capture 15-year-olds who share intimate images with each other.

Whenever we make these kinds of changes, we have to make sure they do not have unintended consequences. Whether we like the practices that young people engage in online or not is not the question. We just have to make sure we do not capture innocent people when we are trying to capture those who profit from exploitation. The second part, in terms of unintended consequences, is I think we have to keep in mind there are those who are engaged in lawful forms of sex work online, and we have to make sure they are not captured under the broad strokes of the bill.

Again, I am looking forward to hearing the testimony about what will work to tackle these problems. We know the images are already illegal, but we know we lack effective tools in the legal system both to prosecute and to get the images taken down. New Democrats are broadly supportive of the principles in the bill. We are looking forward to the expert testimony I am certain we will hear at committee about what will actually work in tackling the problem. I look forward to the early passage of the bill through to committee.

Stopping Internet Sexual Exploitation Act (Private Members' Business)

April 9th, 2024 / 5:50 p.m.

Winnipeg North, Manitoba

Liberal

Kevin Lamoureux Liberal Parliamentary Secretary to the Leader of the Government in the House of Commons

Madam Speaker, to be very clear, with regard to the issue of non-consensual pornography and child pornography, I like to believe that every member in the House would be deeply offended by any activity that would ultimately lead to, encourage or promote, in any fashion whatsoever, those two things. It angers a great number of us, to the degree that it stirs all forms of emotion. We all want to do what we can to play an important role in making the online world a safer place.

I must say that I was a little surprised when the member for Peace River—Westlock responded to the issue of Bill C-63. I did have some concerns.

When one thinks of non-consensual pornography and child pornography, they are already illegal today in Canada. We know that. I appreciate what is being suggested in the private member's legislation, but he was asked a question in regard to Bill C-63, the government legislation dealing with the online harms act. It is something that is very specific and will actually have a very tangible impact. I do not know 100%, because this is the first time that I heard that members of the Conservative Party might be voting against that legislation. That would go against everything, I would suggest, in principle, that the member opposite talked about in his speech.

The greatest threat today is once that information gets uploaded. How can we possibly contain it? That is, in part, what we should be attempting to deal with as quickly as possible. There was a great deal of consultation and work with stakeholders in all forms to try to deal with that. That is why we have the online harms act before us today.

I wanted to ask the member a question. The question I was going to ask the member is this: Given the very nature of his comments, would he not agree that the House should look at a way in which we could expedite the passage of Bill C-63?

By doing that, we are going to be directly helping some of the individuals the member addressed in his opening comments. The essence of what Bill C-63 does is that it provides an obligation, a legal obligation, for online platforms to take off of their platforms child pornography and non-consensual pornography. For example, the victims of these horrific actions can make contact and see justice because these platforms would have 24 hours to take it off. It brings some justice to the victims.

Based on the member's sincerity and how genuine he was when he presented his bill, I do not understand his position. I have a basic understanding of what the member is trying to accomplish in the legislation, and I think that there are some questions that require clarification.

As I indicated, in terms of the idea of child pornography not being illegal, it is illegal today. We need to make that statement very clear. Non-consensual pornography is as well. Both are illegal. There is a consequence to perpetrators today if they are found out. What is missing is how we get those platforms to get rid of those images once those perpetrators start uploading the information and platforms start using the material. That is what the government legislation would provide.

Hopefully before we end the two hours of debate the member can, in his concluding remarks, because he will be afforded that opportunity, provide some thoughts in regard to making sure people understand that this is illegal today and the importance of getting at those platforms. If we do not get at those platforms, the problem is not going to go away.

There was a question posed by I believe a New Democratic member asking about countries around the world. People would be surprised at the motivation used to get child pornography on the net and livestreamed. I have seen some eye-opening presentations that show that in some countries in the world the person who is putting the child on the Internet is a parent or a guardian. They do it as a way to source revenue. They do it for income for the family. How sad is that?

How angering is it to see the criminal element in North America that exploits these individuals, and children in particular? This is not to mention, of course, the seriousness of non-consensual pornography, but think of the trauma created as a direct result of a child going through things a child should never, ever have to experience. This will have a lifetime effect on that child. We know that. We see generational issues as a direct result of it.

That is the reason I like to think that every member of the House of Commons would look at the issue at hand and the principles of what we are talking about and want to take some initiative to minimize it. Members need to talk to the stakeholders. I have had the opportunity in different ways over the last number of years to do so. It is one of the reasons I was very glad to see the government legislation come forward.

I was hoping to get clarification from the member on Bill C-270. He may be thrown off a little because of Bill C-63, which I believe will be of greater benefit than Bill C-270. After listening to the member speak though, I found out that the Conservative Party is apparently looking at voting against Bill C-63.

We come up with things collectively as a House to recognize important issues and put forward legislation that would have a positive impact, and I would suggest that Bill C-63 is one of those things. I would hope the member who introduced this private member's bill will not only be an advocate for his bill but be the strongest voice and advocate within his own caucus for the online harms act, Bill C-63, so we can get the support for that bill. It would literally save lives and take ungodly things off the Internet. It would save the lives of children.

Stopping Internet Sexual Exploitation Act (Private Members' Business)

April 9th, 2024 / 5:50 p.m.

Randall Garrison (NDP, Esquimalt—Saanich—Sooke, BC)

Madam Speaker, I thank the member for bringing forward this private member's bill, which directs our attention to some really important problems.

Is the member familiar with the report from the Department of Justice on cyber-bullying and non-consensual distribution of images from just a year ago, which takes quite a different approach from his bill and says we need to rewrite the existing offence so it is easier to prosecute and include measures, which are now in Bill C-63, to allow forfeiture, seizure, restitution and peace bonds in connection with these kinds of things?

Stopping Internet Sexual Exploitation Act (Private Members' Business)

April 9th, 2024 / 5:45 p.m.

Arnold Viersen (Conservative, Peace River—Westlock, AB)

Madam Speaker, Bill C-63 has no criminal offences around the uploading of this kind of content. In this bill, it would be a criminal offence to upload. We want to make sure this content never hits the Internet. A 24-hour takedown period is not good enough. We want to ensure that companies are doing their due diligence to ensure that their content is of people who are of age and that people consent to it.

An important piece of this bill is also that, if somebody has made a written request saying they revoke their consent, immediately that content must come down.

Stopping Internet Sexual Exploitation Act (Private Members' Business)

April 9th, 2024 / 5:45 p.m.

Chris Bittle (Liberal, St. Catharines, Ontario), Parliamentary Secretary to the Minister of Housing

Madam Speaker, the topic that the member is dealing with is particularly important. One of the arguments that he is making is with respect to taking down this heinous material online. I agree with him. However, the bill does not make any provisions for it.

Bill C-63, which is government legislation, does make provisions for taking down these types of heinous materials. The member's leader has said that he would vote against it. I wonder if the hon. member will be supporting Bill C-63 or if he is going to stick with what is here that would not accomplish the objectives that he is seeking, which I hope we would all be in favour of.

Government Responses to Order Paper Questions (Privilege, Oral Questions)

April 9th, 2024 / 3:15 p.m.

Rachael Thomas (Conservative, Lethbridge, AB)

Mr. Speaker, I support this question of privilege in light of the violation of the government's obligation to answer an Order Paper question, but I would also add to it, considering how the government has taken steps to take control of the Internet in Canada.

It has done this through legislation like Bill C-11, which centralizes regulatory control of what Canadians can see, hear and post online based on what the government deems “Canadian”.

In addition, I highlight Bill C-18, which has resulted in the government being one of the biggest gatekeepers of news in Canada. This is a major conflict of interest and a direct attack on journalistic integrity in this country.

Now, most recently, through Bill C-63, the government proposes to establish an entire commission, yet another arm of the government, that would regulate online harm.

How can Canadians trust the government to police various aspects of the Internet if it cannot even be honest and tell the truth about the content requested to be taken down? Trust is paramount, and frankly the government has not earned any of it. The truth must prevail.

Mr. Speaker, you have the opportunity to look into this and to get to the bottom of it, or you can keep us in the dark and allow secrecy and injustice to reign. I understand that you are the one to make this decision, and we are putting our trust in you to make sure that this place is upheld and democracy is kept strong.

Alleged Premature Disclosure of Bill C-63 (Privilege, Oral Questions)

March 21st, 2024 / 3:15 p.m.

Andrew Scheer (Conservative, Regina—Qu'Appelle, SK)

Mr. Speaker, I wanted to make a very brief intervention in response to the government House leader's parliamentary secretary's response to my question of privilege on Bill C-63 and the leak that occurred.

The parliamentary secretary's 25-minute submission extensively quoted the Internet. What it did not do, however, was explain exactly how the sources whom Travis Dhanraj and Rachel Aiello spoke to were lucky enough to state precisely which of the options the government consulted on would make it into the bill.

Had the reporting been based on the published consultation documents, the media reports would have said so, but they did not. They quoted “sources” who were “not authorized to speak publicly on the matter before the bill is tabled in Parliament.” The parliamentary secretary's implication that the sources were all stakeholders uninformed about the ways of Parliament is demonstrably untrue. CTV's source was “a senior government source”. The CBC attributed its article to “two sources, including one with the federal government”. Besides, had these sources actually all been stakeholders speaking about previous consultations, why would they have sought anonymity to begin with, let alone specify the need for anonymity, because the bill had not yet been introduced?

As I said back on February 26, the leakers knew what they were doing. They knew it was wrong, and they knew why it was wrong. We are not talking about general aspects of the bill that might have been shared with stakeholders during consultation processes. We are talking about very detailed information that was in the legislation and was leaked to the media before it was tabled in the House. That is the issue we are asking you to rule on, Mr. Speaker.

March 21st, 2024 / 9:15 a.m.

Arif Virani (Liberal, Parkdale—High Park, ON)

My concluding remarks would be, with respect to Bill S-210 proposed by Senator Miville-Dechêne, that there are very legitimate questions that relate to privacy interests. We need to understand that age verification and age-appropriate design features are entrenched in Bill C-63, something that Monsieur Fortin seemed to misunderstand.

Second, the idea of uploading age-verification material such as one's government ID has been roundly criticized, including by law enforcement, who are concerned about what that kind of privacy disclosure could do to perpetuate financial crimes against Canadians.

What we need to be doing here is keeping Canadians safe by ensuring that their age-appropriate design measures have been informed by a conversation between law enforcement, government and the platforms themselves. There are examples of how to do this, and we're keen to work on those examples and to get this important bill into this committee so we can debate the best ways forward.

Thank you.

March 21st, 2024 / 9:05 a.m.

Élisabeth Brière (Liberal, Sherbrooke, QC)

Thank you, Madam Chair.

Good morning, Minister. I'd like to thank you and your entire team for being with us this morning.

We are living in an increasingly divided world. Even though everyone is entitled to their own opinion, people are either for or against different issues. We are quick to put people into categories, to see them as being on one side or another and slap labels on them. In this increasingly complex world, and perhaps as my previous role taught me, I think it would help if people were more caring, attentive and open to each other.

In your opening remarks, you referred to Bill C‑63, which aims to protect children online. We have been hearing a lot about this bill. I have two questions for you.

First, do you believe that the definition of “hate speech” in Bill C‑63 will really make it possible to achieve the goal of protecting children online?

Second, the bill seems to apply pre-emptively, even before a person has said or done anything. I wonder if you could tell me your thoughts on that.

March 21st, 2024 / 9 a.m.

Rob Moore (Conservative, Fundy Royal, NB)

Thank you, Minister.

Madam Chair, we'll have lots of time to debate Bill C‑63 in the future. I think the verdict is coming out very quickly on that. I want to use what's left of my time to now move my motion regarding former minister David Lametti on the issue of ex-judge Delisle, where the minister ordered a new trial.

I'm moving that motion now, Madam Chair.

March 21st, 2024 / 8:55 a.m.

Rob Moore (Conservative, Fundy Royal, NB)

Thank you, Chair.

Minister, we're here on the estimates today. You spent your entire opening remarks on a defence of Bill C-63. I recall your predecessor, Minister Lametti, when he was here. I asked him a question on the issue of MAID, when I think 25 constitutional experts said the minister's opinion on the matter was wrong. I asked the minister who was right, him or these 25 constitutional experts, and he said he was.

That kind of hubris is probably a good reason why he's no longer here and now you are, but we're starting to see that same thing on Bill C-63 with you, when virtually everyone has come out and said this was an effort to trample down freedom of speech. Margaret Atwood described Bill C-63 as “Orwellian”. David Thomas, who was chairperson of the Canadian Human Rights Tribunal, said:

The Liberal government's proposed Bill C-63, the online harms act, is terrible law that will unduly impose restrictions on Canadians' sacred Charter right to freedom of expression. That is what the Liberals intend. By drafting a vague law creating a draconian regime to address online “harms”, they will win their wars without firing a bullet.

There's a diverse group of people who feel that Bill C-63 is an outrageous infringement on Canadians' rights. We also see a government that will not stand up for the most vulnerable.

You had the opportunity, Minister, to introduce a bill that would have protected children, but your government, true to form, could not resist taking aim at their political opponents. This is not about hate speech, it's about speech that Liberals hate, and shutting that down.

Now Bill C-63, if it unfortunately were to pass, would also be struck down by the courts. If you were in a position to appeal it, I have no doubt you would. That brings me to my question on your government's radical agenda.

You've decided to file a number of appeals in recent court rulings. You've appealed a ruling that found the invocation of the Emergencies Act was unconstitutional. You appealed a ruling that found that the plastic bag ban and the plastic straw ban that Canadians hate so much was unconstitutional. You were quick to appeal those. But when the Supreme Court ruled the six-month minimum sentence for the crime of child luring was unconstitutional, you chose not to file an appeal.

Why is it that, when your government's radical agenda is challenged in the courts, you're quick to appeal, but when vulnerable Canadians' lives are at stake, you choose not to appeal?

March 21st, 2024 / 8:50 a.m.

Arif Virani (Liberal, Parkdale—High Park, ON)

Thank you, Mr. Garrison, for your leadership on the first part of what you talked about and the courage that you continue to show as a parliamentarian, and also for your leadership and that of Laurel Collins on coercive control.

In terms of supporting victims, we are constantly and actively thinking about how to better support victims, including victims of intimate partner violence. Please take a cue from what we did in Bill C-75 and in Bill C-48 with respect to the reverse onus on bail for survivors of intimate partner violence. Issues about support and funding are always on the table.

Also, please understand that when you talk about a 24-hour takedown of things like revenge porn, you're dealing with an aspect of coercive control that exists right now. That's in Bill C-63.

You also mentioned, in your opening, hearing from voices. I think two of the most salient voices that I heard from were the two that were at the press conference with me: Jane, the mother of a child who has been sexually abused and repeatedly exploited online, and Carla Beauvais, a woman who has been intimidated and has retreated from participating in the public space.

I would also suggest taking your cues from the groups that were also there beside me. The National Council of Canadian Muslims and the Centre for Israel and Jewish Affairs have, in the last six months, not seen eye to eye on a lot of issues. On this bill, they do see eye to eye. They both support this, as do the special envoys on anti-Semitism and Islamophobia. Those are important voices to be hearing from, and that's what I will continue to do.

March 21st, 2024 / 8:50 a.m.

Randall Garrison (NDP, Esquimalt—Saanich—Sooke, BC)

No, I won't.

I want to thank the minister for his very clear presentation on Bill C-63.

I want to add two things to this discussion. One is that the loudest voices on this bill often do not include those who are most likely to be subjected to hate crime campaigns. When it comes before this committee, I'm looking forward to a diversity of witnesses who can talk about the real-world impacts that online hate has. We've seen it again and again. It's often well organized.

I stood outside the House of Commons and defended the rights of trans kids. Within one day, I had 700 emails with the same defamatory and hateful two-word phrase used to describe me. I am a privileged person. I have a staff. I have all the resources and support I need. However, when you think about what happens to trans kids and their families when they are subjected to these online hate crimes, it has very real consequences.

I'm looking forward to us being able to hear from diverse voices and, in particular, those who are most impacted. I know this is not really a question to you at this point.

We have other important work we've been doing in this committee. I want to turn to Bill C-332, which just passed this committee and was sent back to the House. This is the bill on controlling and coercive behaviour. This committee has been dealing with this topic for more than three years. One of the things that we quite clearly said was that the passage of this bill is a tool for dealing with the epidemic of intimate partner violence, but it's not the only tool.

I guess I'm asking two things here.

What other plans does the Department of Justice have to provide the necessary and associated supports for survivors of intimate partner violence?

What plans are there to do the educational work that will be necessary?

The bill says it will be proclaimed at a time chosen by cabinet. I'm assuming there will be a plan to get ready for this. I'm interested in what's going to happen with that plan. It has unanimous support, so I don't think it's premature to be asking about this at this point.

March 21st, 2024 / 8:50 a.m.

Arif Virani (Liberal, Parkdale—High Park, ON)

The point I want to make about Bill S‑210 is that Bill C‑63 already contains age verification mechanisms. Furthermore, we must always protect the privacy rights of Canadians. In other words—

March 21st, 2024 / 8:45 a.m.

Rhéal Fortin (Bloc, Rivière-du-Nord, QC)

Thank you, Chair.

Thank you for being here, Minister.

I have several questions running through my head, but I'll have to prioritize them. I wish I had more time, but I understand that's the way it has to be done.

First, I have some questions about the legal aid system for immigrants and refugees. I'm sure you understand that this issue is of great concern to the Bloc Québécois. In Quebec, the amount owed by the federal government is a problem. In fact, the Quebec government is not getting paid, yet it continues to spend on newcomers.

There's also the question of official languages. A total of $1.2 million has been earmarked for official languages and I'm interested in hearing how that money will be distributed among the provinces.

In addition, there's obviously the whole issue of systemic racism. You want to help judges impose sentences that take this into account. How is that going to work? How are we going to define systemic racism?

There's the question of cybersecurity, in courthouses, etc.

There are plenty of important issues, essential even, that I won't necessarily be able to address this morning, unfortunately. However, I will try.

There's also Bill C‑63, which you told us about in your opening remarks. I'm not sure how it relates to the Supplementary Estimates (C), but it is an important question, regardless. With respect to this bill, I am curious as to why you didn't introduce the age verification process, as proposed by Senator Julie Miville-Dechêne. Her proposal seemed relatively wise to me, but there's no mention of it at all in Bill C‑63.

The Bloc Québécois is in the same boat. We've proposed abolishing the two religious exceptions in the Criminal Code, which I think is essential in the current context. How is it possible that someone can still build their defence around the idea that they committed a hate crime or spread hatred because of a religious text? That is completely absurd and contrary to the values shared by all Quebeckers and, I'm certain, by the rest of Canada too.

These are all essential questions, but I'm going to focus on two important elements.

First, our committee recently passed a bill that aims to create a commission to review errors in the justice system. This is obviously something that had to be done; congratulations. I think it was high time for a major clean‑up. The commission will comprise nine members. I've tabled an amendment to the effect that these nine commissioners should be bilingual. In fact, I'm a little surprised that this wasn't planned from the outset. Still, it seems a very modest goal. Nine bilingual commissioners across Canada shouldn't be too hard to achieve. However, I've run into an objection from some of my colleagues, including one of your Liberal colleagues.

I'd like to hear your thoughts on this. If we want the justice system to be bilingual, shouldn't we necessarily make an effort by asking for bilingualism among these nine commissioners? It's not as though there are 900 of them; there are nine.

March 21st, 2024 / 8:20 a.m.

Arif Virani (Liberal, Parkdale—High Park, Ontario), Minister of Justice and Attorney General of Canada

Thank you, Chair, and members of the Committee.

Thank you for inviting me to join you today.

I would like to begin by acknowledging that we are meeting on the traditional unceded territory of the Algonquin Anishinaabe Nation.

As I am sure you have seen, a few weeks ago, I introduced Bill C‑63, the Online Harms Act. I want to both explain the vital importance of the Online Harms Act and dispel misunderstandings about what it does and doesn't do.

The premise of this legislation is simple: we all expect to be safe in our homes, neighbourhoods and communities. We should be able to expect the same kind of security in our online communities. We need to address the online harms that threaten us, and especially our children, every day.

Let me start by talking about our children.

There are currently no safety standards mandated for the online platforms that kids use every day. In contrast, my children's LEGO in our basement is subject to rigorous safety standards and testing before my two boys get their hands on it. I know that these days my children spend much more time online than playing with their LEGO. The most dangerous toys in my home right now and in every Canadian home are the screens our children are on. Social media is everywhere. It brings unchecked dangers and horrific content. This, frankly, terrifies me. We need to make the Internet safe for our young people around the country.

As parents, one of the first things we teach all of our kids is how to cross the road. We tell them to wait for the green light. We tell them to look in both directions. We trust our children, but we also have faith in the rules of the road and that drivers will respect the rules of the road. We trust that cars will stop at a red light and obey the speed limit. Safety depends on a basic network of trust. This is exactly what we are desperately lacking in the digital world. The proposed online harms act would establish rules of the road for platforms so that we can teach our kids to be safe online, with the knowledge that platforms are also doing their part.

Now, let's talk about hate crimes.

The total number of police-reported hate crimes in Canada has reached its highest level on record, nearly doubling the rate recorded in 2019.

Police across the country are calling the increase “staggering”. Toronto Police Chief Myron Demkiw said this week that hate crime calls in Toronto have increased by 93% since last October. Communities and law enforcement have been calling on governments to act.

Bill C-63 creates a new stand-alone hate crime offence to make sure that hate crimes are properly prosecuted and identified. Under our current legal system, hate motivation for a crime is only considered as an afterthought at the sentencing stage; it is not part of the offence itself. The threshold for criminal hatred is high. Comments that offend, humiliate or insult do not hit the standard of hatred. They are what we call awful but lawful. The definition of hate that we are embedding in the Criminal Code comes straight from the Supreme Court of Canada in the Keegstra and Whatcott decisions. We did not make up the definition of hatred that we are proposing.

It has been disappointing, though not surprising, to see the wildly inaccurate assertions made by some commentators about how sentencing for this new hate crime provision would work. I have heard some claim that, under this provision, someone who commits an offence under the National Parks Act would now be subject to a life sentence. That is simply false.

In Canada, judges impose sentences following sentencing ranges established through past decisions. Judges are required by law—and every member of this committee who is a lawyer will know this—to impose sentences that are proportionate to the offence committed. In other words, the punishment must always fit the crime. If judges impose sentences that are unfit, we have appeal courts that can overturn those sentences.

You may be asking, “Well, why not specify that, Minister? Why put a maximum sentence of life in the new hate crime provision?”

Let me explain.

First, it's important to remember that a maximum sentence is not an average sentence; it's an absolute ceiling.

Second, the new hate crime offence captures any existing offence if it was hate-motivated. That can run the gamut from a hate-motivated theft all the way to a hate-motivated attempted murder. The sentencing range entrenched in Bill C-63 was designed to mirror the existing sentencing options for all of these potential underlying offences, from the most minor to the most serious offences on the books, such as attempted murder, which can attract, right now, a life sentence.

This does not mean that minor offences will suddenly receive extremely harsh sentences. This would violate all the legal principles that sentencing judges are required to follow. Hate-motivated murder will result in a life sentence. A minor infraction will certainly not result in it.

Another criticism I have heard is that this bill could stifle freedom of expression. This is simply not true. On the contrary, this bill strengthens freedom of expression. There are people in Canada who cannot speak out because they legitimately fear for their safety. When they speak out, they are mistreated and subjected to truly despicable threats, intimidation and harassment.

This is carefully balanced. We consulted. We looked abroad.

We do not automatically take down material within 24 hours except for child sexual abuse material or revenge pornography. We do not touch private communications. We do not affect individual websites that do not host user-generated content.

This bill protects children and gives everyone the tools they need to protect themselves online. We do not tolerate hate speech in the public square. Nor must we tolerate hate speech online.

We have seen the consequences of unchecked online hate and child sexual exploitation. Ask the families of the six people killed at the Quebec City mosque by someone who was radicalized online.

Ask the young boy orphaned by the horrific attack on four members of the Afzaal family in London, Ontario. Ask the parents of young people right across this country who have taken their own lives after being sextorted by online predators.

Finally, let me set the record straight on the peace bond provision in Bill C-63. Peace bonds are not house arrests. Peace bonds are not punishments. Peace bonds are well-established tools used to impose individually tailored conditions on someone when there is credible evidence to show that they may hurt someone or commit a crime. The proposed peace bond here would operate very similarly to existing peace bonds.

As an example, if someone posts online about their plan to deface or attack a synagogue to intimidate the Jewish community, members of the synagogue could take this information to the police and the court. They could seek to have a peace bond imposed after obtaining consent from the provincial attorney general. Decades of case law tell us that conditions must be reasonable and linked to the specific threat. Here conditions imposed on the person could include staying 100 metres away from that synagogue for a period of 12 months. If the person breached that simple condition, they could be arrested. If they abided by the conditions, they would face no consequences.

I ask you this: Why should members of that synagogue, when facing a credible threat of being targeted by a hate-motivated crime, have to wait to be attacked or to have a swastika graffitied on the front door before we act to help them? If we can prevent some attacks from happening, isn't that much better? Peace bonds are not perfect, but we believe they can be a valuable tool to keep people safe. In the face of rising hate crime, our government believes that doing nothing in an instance like this would be irresponsible.

I think that's what explains both CIJA's and the special envoy on anti-Semitism's support of Bill C-63.

As always, I am open to good faith suggestions to improve this legislation. My goal is to get it right. I look forward to debating the Online Harms Act in the House of Commons and following the committee's process as it reaches that stage. I am convinced that we all have the same goal here: we need to create a safe online world, especially for the most vulnerable members of our society—our children.

Thank you for your time.

I'm happy to take your questions.

Alleged Premature Disclosure of Bill C-63 / Privilege / Government Orders

March 19th, 2024 / 5:15 p.m.

Kevin Lamoureux (Liberal, Winnipeg North, Manitoba), Parliamentary Secretary to the Leader of the Government in the House of Commons

Mr. Speaker, I am rising to respond to a question of privilege raised by the member for Regina—Qu'Appelle on February 26 regarding the alleged premature disclosure of the content of Bill C-63, the online harms act.

I would like to begin by stating that the member is incorrect in asserting that there was a leak of the legislation, and I will outline the comprehensive process of consultation and the information that was in the public domain on this issue long before the bill was placed on notice.

Online harms legislation is something the government has been talking about for years. In 2015, the government promised to make ministerial mandate letters public, a significant departure from the secrecy around those key policy commitment documents under previous governments. Because the mandate letters are published, reporters are able to use the language from these letters to anticipate what a government bill on notice may contain.

In the 2021 Liberal election platform entitled “Forward. For Everyone.”, the party committed to the following:

Introduce legislation within its first 100 days to combat serious forms of harmful online content, specifically hate speech, terrorist content, content that incites violence, child sexual abuse material and the non-consensual distribution of intimate images. This would make sure that social media platforms and other online services are held accountable for the content that they host. Our legislation will recognize the importance of freedom of expression for all Canadians and will take a balanced and targeted approach to tackle extreme and harmful speech.

Strengthen the Canadian Human Rights Act and the Criminal Code to more effectively combat online hate.

The December 16, 2021, mandate letter from the Prime Minister to the Minister of Justice and Attorney General of Canada asked the minister to achieve results for Canadians by delivering on the following commitment:

Continue efforts with the Minister of Canadian Heritage to develop and introduce legislation as soon as possible to combat serious forms of harmful online content to protect Canadians and hold social media platforms and other online services accountable for the content they host, including by strengthening the Canadian Human Rights Act and the Criminal Code to more effectively combat online hate and reintroduce measures to strengthen hate speech provisions, including the re-enactment of the former Section 13 provision. This legislation should be reflective of the feedback received during the recent consultations.

Furthermore, the December 16, 2021, mandate letter from the Prime Minister to the Minister of Canadian Heritage also asked the minister to achieve results for Canadians by delivering on the following commitment:

Continue efforts with the Minister of Justice and Attorney General of Canada to develop and introduce legislation as soon as possible to combat serious forms of harmful online content to protect Canadians and hold social media platforms and other online services accountable for the content they host. This legislation should be reflective of the feedback received during the recent consultations.

As we can see, the government publicly stated its intention to move ahead with online harms legislation, provided information on its plan and consulted widely on the proposal long before any bill was placed on the Notice Paper.

I will now draw to the attention of the House just how broadly the government has consulted on proposed online harms legislation.

First, with regard to online consultations, from July 29 to September 25, 2021, the government published a proposed approach to address harmful content online for consultation and feedback. Two documents were presented for consultation: a discussion guide that summarized and outlined an overall approach, and a technical paper that summarized drafting instructions that could inform legislation.

I think it is worth repeating here that the government published a technical paper with the proposed framework for this legislation back in July 2021. This technical paper outlined the categories of proposed regulated harmful content; it addressed the establishment of a digital safety commissioner, a digital safety commission, regulatory powers and enforcement, etc.

Second is the round table on online safety. From July to November 2022, the Minister of Canadian Heritage conducted 19 virtual and in-person round tables across the country on the key elements of a legislative and regulatory framework on online safety. Virtual sessions were also held on the following topics: anti-Semitism, Islamophobia, anti-Black racism, anti-Asian racism, women and gender-based violence, and the tech industry.

Participants received an information document in advance of each session to prepare for the discussion. This document sought comments on the advice from the expert advisory group on online safety, which concluded its meetings on June 10. The feedback gathered from participants touched upon several key areas related to online safety.

Third is the citizens' assembly on democratic expression. The Department of Canadian Heritage, through the digital citizen initiative, is providing financial support to the Public Policy Forum's digital democracy project, which brings together academics, civil society and policy professionals to support research and policy development on disinformation and online harms. One component of this multi-year project is an annual citizens' assembly on democratic expression, which considers the impacts of digital technologies on Canadian society.

The assembly took place between June 15 and 19, 2023, in Ottawa, and focused on online safety. Participants heard views from a representative group of citizens on the core elements of a successful legislative and regulatory framework for online safety.

Furthermore, in March 2022, the government established an expert advisory group on online safety, mandated to provide advice to the Minister of Canadian Heritage on how to design the legislative and regulatory framework to address harmful content online and how to best incorporate the feedback received during the national consultation held from July to September 2021.

The expert advisory group, composed of 12 individuals, participated in 10 weekly workshops on the components of a legislative and regulatory framework for online safety. These included an introductory workshop and a summary concluding workshop.

The government undertook its work with the expert advisory group in an open and transparent manner. A Government of Canada web page, entitled “The Government's commitment to address online safety”, has been online for more than a year. It outlines all of this in great detail.

I now want to address the specific areas that the opposition House leader raised in his intervention. The member pointed to a quote from a CBC report referencing the intention to create a new regulator that would hold online platforms accountable for harmful content they host. The same website that I just referenced states the following: “The Government of Canada is committed to putting in place a transparent and accountable regulatory framework for online safety in Canada. Now, more than ever, online services must be held responsible for addressing harmful content on their platforms and creating a safe online space that protects all Canadians.”

Again, this website has been online for more than a year, long before the bill was actually placed on notice. The creation of a regulator to hold online services to account is something the government has been talking about, consulting on and committing to for a long period of time.

The member further cites a CBC article that talks about a new regulatory body to oversee a digital safety office. I would draw to the attention of the House the “Summary of Session Four: Regulatory Powers” of the expert advisory group on online safety, which states:

There was consensus on the need for a regulatory body, which could be in the form of a Digital Safety Commissioner. Experts agreed that the Commissioner should have audit powers, powers to inspect, have the powers to administer financial penalties and the powers to launch investigations to seek compliance if a systems-based approach is taken—but views differed on the extent of these powers. A few mentioned that it would be important to think about what would be practical and achievable for the role of the Commissioner. Some indicated they were reluctant to give too much power to the Commissioner, but others noted that the regulator would need to have “teeth” to force compliance.

This web page has been online for months.

I also reject the premise of what the member for Regina—Qu'Appelle stated when quoting the CBC story in question as it relates to the claim that the bill will be modelled on the European Union's Digital Services Act. This legislation is a made-in-Canada approach. The European Union model regulates more than social media and targets the marketplace and sellers. It also covers election disinformation and certain targeted ads, which our online harms legislation does not.

The member also referenced a CTV story regarding the types of online harms that the legislation would target. I would refer to the 2021 Liberal election platform, which contained the following areas as targets for the proposed legislation: “hate speech, terrorist content, content that incites violence, child sexual abuse material and the non-consensual distribution of intimate images.” These five items were the subject of the broad-based and extensive consultations I referenced earlier in my intervention.

Based on these consultations, a further two were added to the list to be considered. I would draw the attention of the House to an excerpt from the consultation entitled, “What We Heard: The Government’s proposed approach to address harmful content online”, which states, “Participants also suggested the inclusion of deep fake technology in online safety legislation”. It continues, “Many noted how child pornography and cyber blackmailing can originate from outside of Canada. Participants expressed frustration over the lack of recourse and tools available to victims to handle such instances and mentioned the need for a collaborative international effort to address online safety.”

It goes on to state:

Some respondents appreciated the proposal going beyond the Criminal Code definitions for certain types of content. They supported the decision to include material relating to child sexual exploitation in the definition that might not constitute a criminal offence, but which would nevertheless significantly harm children. A few stakeholders said that the proposal did not go far enough and that legislation could be broader by capturing content such as images of labour exploitation and domestic servitude of children. Support was also voiced for a concept of non-consensual sharing of intimate images.

It also notes:

A few respondents stated that additional types of content, such as doxing (i.e., the non-consensual disclosure of an individual’s private information), disinformation, bullying, harassment, defamation, conspiracy theories and illicit online opioid sales should also be captured by the legislative and regulatory framework.

This document has been online for more than a year.

I would also point to the expert advisory group's “Concluding Workshop Summary” web page, which states:

They emphasized the importance of preventing the same copies of some videos, like live-streamed atrocities, and child sexual abuse, from being shared again. Experts stressed that many file sharing services allow content to spread very quickly.

It goes on to say:

Experts emphasized that particularly egregious content like child sexual exploitation content would require its own solution. They explained that the equities associated with the removal of child pornography are different than other kinds of content, in that context simply does not matter with such material. In comparison, other types of content like hate speech may enjoy Charter protection in certain contexts. Some experts explained that a takedown obligation with a specific timeframe would make the most sense for child sexual exploitation content.

It also notes:

Experts disagreed on the usefulness of the five categories of harmful content previously identified in the Government’s 2021 proposal. These five categories include hate speech, terrorist content, incitement to violence, child sexual exploitation, and the non-consensual sharing of intimate images.

Another point is as follows:

A few participants pointed out how the anonymous nature of social media gives users more freedom to spread online harm such as bullying, death threats and online hate. A few participants noted that this can cause greater strain on the mental health of youth and could contribute to a feeling of loneliness, which, if unchecked, could lead to self-harm.

Again, this web page has been online for more than a year.

The member further cites the CTV article's reference to a new digital safety ombudsperson. I would point to the web page of the expert advisory group for the “Summary of Session Four: Regulatory Powers”, which states:

The Expert Group discussed the idea of an Ombudsperson and how it could relate to a Digital Safety Commissioner. Experts proposed that an Ombudsperson could be more focused on individual complaints ex post, should users not be satisfied with how a given service was responding to their concerns, flags and/or complaints. In this scheme, the Commissioner would assume the role of the regulator ex ante, with a mandate devoted to oversight and enforcement powers. Many argued that an Ombudsperson role should be embedded in the Commissioner’s office, and that information sharing between these functions would be useful. A few experts noted that the term “Ombudsperson” would be recognizable across the country as it is a common term and [has] meaning across other regimes in Canada.

It was mentioned that the Ombudsperson could play more of an adjudicative role, as distinguished from...the Commissioner’s oversight role, and would have some authority to have certain content removed off of platforms. Some experts noted that this would provide a level of comfort to victims. A few experts raised questions about where the line would be drawn between a private complaint and resolution versus the need for public authorities to be involved.

That web page has been online for months.

Additionally, during the round table on online safety and anti-Black racism, as the following summary states:

Participants were supportive of establishing a digital safety ombudsperson to hold social media platforms accountable and to be a venue for victims to report online harms. It was suggested the ombudsperson could act as a body that takes in victim complaints and works with the corresponding platform or governmental body to resolve the complaint. Some participants expressed concern over the ombudsperson's ability to process and respond to user complaints in a timely manner. To ensure the effectiveness of the ombudsperson, participants believe the body needs to have enough resources to keep pace with the complaints it receives. A few participants also noted the importance for the ombudsperson to be trained in cultural nuances to understand the cultural contexts behind content that is reported to them.

That web page has been online for more than a year.

Finally, I would draw the attention of the House to a Canadian Press article of February 21, 2024, which states, “The upcoming legislation is now expected to pave the way for a new ombudsperson to field public concerns about online content, as well as a new regulatory role that would oversee the conduct of internet platforms.” This appeared online before the bill was placed on notice.

Mr. Speaker, as your predecessor reiterated in his ruling on March 9, 2021, “it is a recognized principle that the House must be the first to learn the details of new legislative measures.” He went on to say, “...when the Chair is called on to determine whether there is a prima facie case of privilege, it must take into consideration the extent to which a member was hampered in performing their parliamentary functions and whether the alleged facts are an offence against the dignity of Parliament.” The Chair also indicated:

When it is determined that there is a prima facie case of privilege, the usual work of the House is immediately set aside in order to debate the question of privilege and decide on the response. Given the serious consequences for proceedings, it is not enough to say that the breach of privilege or contempt may have occurred, nor to cite precedence in the matter while implying that the government is presumably in the habit of acting in this way. The allegations must be clear and convincing for the Chair.

The government understands and respects the well-established practice that members have a right of first access to the legislation. It is clear that the government has been talking about and consulting widely on its plan to introduce online harms legislation for the past two years. As I have demonstrated, the public consultations have been wide-ranging and in-depth with documents and technical papers provided. All of this occurred prior to the bill's being placed on notice.

Some of the information provided by the member for Regina—Qu'Appelle is not even in the bill, most notably the reference to its being modelled on the European Union's Digital Services Act, which is simply false, as I have clearly demonstrated. The member also hangs his arguments on the usage of the vernacular “not authorized to speak publicly” in the media reports he cites. It is certainly not proof of a leak, especially when the government consulted widely and publicly released details on the content of the legislative proposal for years before any bill was actually placed on notice.

The development of the legislation has been characterized by open, public and wide-ranging consultations with specific proposals consulted on. This is how the Leader of the Opposition was able to proclaim, on February 21, before the bill was even placed on notice, that he and his party were vehemently opposed to the bill. He was able to make this statement because of the public consultation and the information that the government has shared about its plan over the last two years. I want to be clear that the government did not share the bill before it was introduced in the House, and the evidence demonstrates that there was no premature disclosure of the bill.

I would submit to the House that consulting Canadians this widely is a healthy way to produce legislation and that the evidence I have presented clearly demonstrates that there is no prima facie question of privilege. It is our view that this does not provide grounds for the Chair to conclude that there was a breach of the privileges of the House, nor to give the matter precedence over all other business of the House.

Alleged Premature Disclosure of Bill C-63 / Privilege / Government Orders

February 26th, 2024 / 5:15 p.m.

Andrew Scheer (Conservative, Regina—Qu'Appelle, SK)

Mr. Speaker, I am rising this afternoon on a question of privilege concerning the leak of key details of Bill C-63, the so-called online harms bill, which was tabled in the House earlier today.

While a lot will be said in the days, weeks and months ahead about the bill in the House, its parliamentary journey is not off to a good start. Yesterday afternoon, the CBC published on its website an article entitled “Ottawa to create regulator to hold online platforms accountable for harmful content: sources”. The article, written by Naama Weingarten and Travis Dhanraj, outlined several aspects of the bill with the information attributed to two sources “with knowledge of Monday's legislation”.

I will read brief excerpts of the CBC's report revealing details of the bill before it was tabled in Parliament.

“The Online Harms Act, expected to be introduced by the federal government on Monday, will include the creation of a new regulator that would hold online platforms accountable for harmful content they host, CBC News has confirmed.”

“The new regulatory body is expected to oversee a digital safety office with the mandate of reducing online harm and will be separate from the Canadian Radio-television and Telecommunications Commission (CRTC), sources say.”

"Sources say some components of the new bill will be modelled on the European Union's Digital Services Act. According to the European Commission, its act 'regulates online intermediaries and platforms such as marketplaces, social networks, content-sharing platforms, app stores, and online travel and accommodation platforms.'"

Then, today, CTV News published a second report entitled “Justice Minister to Introduce New Bill to Tackle Harmful Online Content”. In Rachel Aiello's article, she says, “According to a senior government source [Bill C-63] would be expected to put an emphasis on harms to youth including specific child protection obligations for social media and other online platforms, including enhanced preservation requirements. It targets seven types of online harms: hate speech, terrorist content, incitement to violence, the sharing of non-consensual intimate images, child exploitation, cyberbullying, and inciting self-harm, and includes measures to crack down on non-consensual artificial intelligence pornography, deepfakes and require takedown provisions for what's become known as 'revenge porn'. Further, while the sources suggested there will be no new powers for law enforcement, multiple reports have indicated the bill will propose creating a new digital safety ombudsperson to field Canadians' concerns about platform decisions around content moderation.”

As explained in footnote 125 on page 84 of House of Commons Procedure and Practice, third edition, on March 19, 2001, "Speaker Milliken ruled that the provision of information concerning legislation to the media without any effective measures to secure the rights of the House constituted a prima facie case of contempt."

The subsequent report of the Standing Committee on Procedure and House Affairs concluded: “This case should serve as a warning that our House will insist on the full recognition of its constitutional function and historic privileges across the full spectrum of government.”

Sadly, Mr. Speaker, the warning has had to be sounded multiple times since, with your predecessors finding similar prima facie contempts on October 15, 2001, April 19, 2016 and March 10, 2020, not to mention several other close-call rulings that fell short of the necessary threshold yet saw the Chair sound cautionary notes for future reference. A number of those close-call rulings occurred under the present government, which would often answer questions of privilege with claims that no one could be certain who had leaked the bill, or even when it had been leaked, citing advance policy consultations with stakeholders.

Mr. Speaker, your immediate predecessor explained, on March 10, 2020, on page 1,892 of the Debates, the balancing act that must be observed. He said:

The rule on the confidentiality of bills on notice exists to ensure that members, in their role as legislators, are the first to know their content when they are introduced. Although it is completely legitimate to carry out consultations when developing a bill or to announce one’s intention to introduce a bill by referring to its public title available on the Notice Paper and Order Paper, it is forbidden to reveal specific measures contained in a bill at the time it is put on notice.

In the present circumstances, no such defence about stakeholders talking about their consultations can be offered. The two sources the CBC relied upon for its reporting were, according to the CBC itself, granted anonymity “because they were not authorized to speak publicly on the matter before the bill is tabled in Parliament.”

As for the CTV report, its senior government source “was not authorized to speak publicly about details yet to be made public.”

When similar comments were made by the Canadian Press in its report on the leak of the former Bill C-7 respecting medical assistance in dying, Mr. Speaker, your immediate predecessor had this to say when finding a prima facie contempt in his March 10, 2020 ruling:

Everything indicates that the act was deliberate. It is difficult to posit a misunderstanding or ignorance of the rules in this case.

Just as in 2020, the leakers knew what they were doing. They knew it was wrong, and they knew why it was wrong. The House must stand up for its rights, especially against a government that appears happy to trample over them in the pursuit of legislation curtailing Canadians' rights.

Mr. Speaker, if you agree with me that there is a prima facie contempt, I am prepared to move the appropriate motion.

Online Harms Act / Routine Proceedings

February 26th, 2024 / 3:25 p.m.

Arif Virani (Liberal, Parkdale—High Park, Ontario), Minister of Justice and Attorney General of Canada

moved for leave to introduce Bill C-63, An Act to enact the Online Harms Act, to amend the Criminal Code, the Canadian Human Rights Act and An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service and to make consequential and related amendments to other Acts.

(Motions deemed adopted, bill read the first time and printed)