An Act to enact the Online Harms Act, to amend the Criminal Code, the Canadian Human Rights Act and An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service and to make consequential and related amendments to other Acts

Sponsor

Arif Virani, Liberal

Status

Second reading (House), as of June 7, 2024


Summary

This is from the published bill. The Library of Parliament often publishes better independent summaries.

Part 1 of this enactment enacts the Online Harms Act, whose purpose is to, among other things, promote the online safety of persons in Canada, reduce harms caused to persons in Canada as a result of harmful content online and ensure that the operators of social media services in respect of which that Act applies are transparent and accountable with respect to their duties under that Act.
That Act, among other things,
(a) establishes the Digital Safety Commission of Canada, whose mandate is to administer and enforce that Act, ensure that operators of social media services in respect of which that Act applies are transparent and accountable with respect to their duties under that Act and contribute to the development of standards with respect to online safety;
(b) creates the position of Digital Safety Ombudsperson of Canada, whose mandate is to provide support to users of social media services in respect of which that Act applies and advocate for the public interest in relation to online safety;
(c) establishes the Digital Safety Office of Canada, whose mandate is to support the Digital Safety Commission of Canada and the Digital Safety Ombudsperson of Canada in the fulfillment of their mandates;
(d) imposes on the operators of social media services in respect of which that Act applies
(i) a duty to act responsibly in respect of the services that they operate, including by implementing measures that are adequate to mitigate the risk that users will be exposed to harmful content on the services and submitting digital safety plans to the Digital Safety Commission of Canada,
(ii) a duty to protect children in respect of the services that they operate by integrating into the services design features that are provided for by regulations,
(iii) a duty to make content that sexually victimizes a child or revictimizes a survivor and intimate content communicated without consent inaccessible to persons in Canada in certain circumstances, and
(iv) a duty to keep all records that are necessary to determine whether they are complying with their duties under that Act;
(e) authorizes the Digital Safety Commission of Canada to accredit certain persons that conduct research or engage in education, advocacy or awareness activities that are related to that Act for the purposes of enabling those persons to have access to inventories of electronic data and to electronic data of the operators of social media services in respect of which that Act applies;
(f) provides that persons in Canada may make a complaint to the Digital Safety Commission of Canada that content on a social media service in respect of which that Act applies is content that sexually victimizes a child or revictimizes a survivor or intimate content communicated without consent and authorizes the Commission to make orders requiring the operators of those services to make that content inaccessible to persons in Canada;
(g) authorizes the Governor in Council to make regulations respecting the payment of charges by the operators of social media services in respect of which that Act applies, for the purpose of recovering costs incurred in relation to that Act.
Part 1 also makes consequential amendments to other Acts.
Part 2 amends the Criminal Code to, among other things,
(a) create a hate crime offence of committing an offence under that Act or any other Act of Parliament that is motivated by hatred based on certain factors;
(b) create a recognizance to keep the peace relating to hate propaganda and hate crime offences;
(c) define “hatred” for the purposes of the new offence and the hate propaganda offences; and
(d) increase the maximum sentences for the hate propaganda offences.
It also makes related amendments to other Acts.
Part 3 amends the Canadian Human Rights Act to provide that it is a discriminatory practice to communicate or cause to be communicated hate speech by means of the Internet or any other means of telecommunication in a context in which the hate speech is likely to foment detestation or vilification of an individual or group of individuals on the basis of a prohibited ground of discrimination. It authorizes the Canadian Human Rights Commission to deal with complaints alleging that discriminatory practice and authorizes the Canadian Human Rights Tribunal to inquire into such complaints and order remedies.
Part 4 amends An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service to, among other things,
(a) clarify the types of Internet services covered by that Act;
(b) simplify the mandatory notification process set out in section 3 by providing that all notifications be sent to a law enforcement body designated in the regulations;
(c) require that transmission data be provided with the mandatory notice in cases where the content is manifestly child pornography;
(d) extend the period of preservation of data related to an offence;
(e) extend the limitation period for the prosecution of an offence under that Act; and
(f) add certain regulation-making powers.
Part 5 contains a coordinating amendment.


Diversity and Inclusion (Oral Questions)

June 18th, 2024 / 2:30 p.m.

Parkdale—High Park Ontario

Liberal

Arif Virani, Liberal, Minister of Justice and Attorney General of Canada

Mr. Speaker, I welcome the Leader of the Opposition recognizing the divisive rhetoric and the division that is occurring in Canadian society right now. We have a problem with hatred. We have to address that problem. We know that the statistics show that hate crimes have risen 130% over the last five years.

That is why I was proud to stand with CIJA when we tabled Bill C-63, the online harms legislation that would improve penalties for hate crimes, provide a definition of hatred and ensure that we are keeping Canadian communities safe. The special envoy on anti-Semitism supports the bill. CIJA supports the bill. I am just wondering why the Leader of the Opposition does not.

June 13th, 2024 / 5:20 p.m.

Liberal

Patricia Lattanzio Liberal Saint-Léonard—Saint-Michel, QC

In your introduction, Professor Daum Shanks, you spoke about the difficulty for victims to get specific harms recognized as such, as they're not covered by the existing jurisprudence. I'm curious to know how the digital safety commission and the digital safety ombudsperson proposed in Bill C-63 would help us better support victims and make sure they have their voices heard.

June 13th, 2024 / 5:15 p.m.

Liberal

Patricia Lattanzio Liberal Saint-Léonard—Saint-Michel, QC

Thank you, Madam Chair.

My question will be for Professor Daum Shanks.

Professor, are you familiar with the proposed changes in Bill C-63 with regard to the mandatory reporting act?

June 13th, 2024 / 4:50 p.m.

Liberal

Patricia Lattanzio Liberal Saint-Léonard—Saint-Michel, QC

Thank you, Professor Daum Shanks.

In your presentation, you mentioned two observations. You just discussed the first one. I'm going to go with the responsibility.

How is Bill C-63 addressing the notion of responsibility?

June 13th, 2024 / 4:45 p.m.

Liberal

Taleeb Noormohamed Liberal Vancouver Granville, BC

Thank you so much to the witnesses for being here.

I'm going to split my time with my colleague, Ms. Lattanzio.

One of the things that we have sought to do in addressing some of the concerns related to online harms, and in particular some of the issues that have been raised during the course of the study, is making sure that our legislation, Bill C-63, the online harms bill, takes on some of these challenges head-on and works.... As we have said, we are willing to work with all parties to ensure that it's the best possible bill out there.

I don't know if you had a chance to follow the deliberations of our meeting on Tuesday, but our colleague, Mrs. Thomas, raised what I would argue is a very important concern in a number of her questions. It would appear, at least from my read of it, that she was advocating—I don't want to put words in her mouth, but this is the way that I understood what she was suggesting—that we take a risk-based approach in the legislating of regulations around social media platforms. Our belief is that this is exactly what Bill C-63 proposes to do.

Do you agree, first of all, with Bill C-63 and what we're trying to do, and that the right approach to legislating regulations around social media platforms is really to take a risk-based approach, as suggested by Mrs. Thomas and others?

I would refer that question to you, Professor.

June 13th, 2024 / 4:45 p.m.

Lawyer, As an Individual

Keita Szemok-Uto

I would agree that potentially more could be done than what Bill C-63 presents.

I do note, at least, that there is the inclusion of the word “deepfake” in the legislation. If anything, I think it's a good step forward in trying to tackle and get up to date with the definitions that are being used.

I note that legislation recently passed in Pennsylvania is criminal legislation that prohibits the dissemination of artificially generated sexual depictions of individuals. I won't read the definition out in full. It's quite broad.

Bill C-63 does refer to deepfakes, at least, though no definition is provided. I think in that respect, broadening the terminology.... As I said, the privacy law torts are restricted by their definitions and terminology to not include this new problem that we're trying to deal with. To the extent that it gets the definitions more up to date, I think it is a step in the right direction.

June 13th, 2024 / 4:45 p.m.

Conservative

Kevin Waugh Conservative Saskatoon—Grasswood, SK

The problem with having an extrajudicial bureaucratic arm, which I think Bill C-63 is, is that it can actually perpetuate harm.

We can see that because victims—some have mentioned it here today—have to come bravely forward and share their stories with a commissioner or a body that they really don't know. That has actually led to no real power. I think the victim in this case is led to hope and then hope is deferred because there are no real teeth in the legislation.

What are your thoughts on that?

I do think Bill C-63 is a bureaucratic nightmare ready to explode if it does get through the House.

June 13th, 2024 / 4:40 p.m.

Conservative

Kevin Waugh Conservative Saskatoon—Grasswood, SK

Thank you, Madam Chair.

Welcome to everybody here this afternoon.

Mr. Szemok-Uto, you're recently a lawyer. Thank you for that.

As you know, Bill C-63 did not expand the Criminal Code; it has not been updated to include deepfakes. That seems to be a concern not only around this table, but everywhere.

I don't know why, when you introduce a bill, you would not.... We've seen that since 2017 deepfakes are accelerating around the world, yet it's not in this bill. What good is the bill when you don't talk about deepfakes?

What are your thoughts?

June 13th, 2024 / 4:35 p.m.

General Counsel, Canadian Centre for Child Protection Inc.

Monique St. Germain

Education is always a critical component of any policy or initiative that we have. We have lots of different tools in the tool box. We have our criminal law. We have potentially Bill C-63 coming forward to provide some regulatory.... On the education side, obviously it's ensuring that young people are educated about sexual consent and understand the ramifications of sharing intimate images or creating this type of material, etc.

We can't lose sight of the fact that the reason they can do these things is because of the platforms that facilitate this kind of harm. We should have education for parents and for kids, taught through schools and available in a lot of different mediums and the places that kids go. While we can have education, we also need to make sure that we don't lose sight of the fact that there are other actors in the ecosystem who are contributing to the harm that we're all seeing.

June 13th, 2024 / 4:30 p.m.

Bloc

Martin Champoux Bloc Drummond, QC

This type of content is easy to produce. There are even applications for that. It's quite appalling.

People are talking a great deal about Bill C‑63, which seeks to regulate hateful and inappropriate content online.

Beyond legislation, do you feel that the platforms could do more about this?

Do you think they are now able to do more technologically, contrary to what they claim?

June 13th, 2024 / 4:25 p.m.

Associate Professor, University of British Columbia, As an Individual

Dr. Heidi Tworek

We didn't specifically discuss generative AI that much within our group, but I think that within Bill C-63 there's certainly at least an attention to the question of deepfakes. I think there's a concept of a duty to act responsibly that's certainly capacious enough to be able to deal with these kinds of updates. If we're thinking about generative AI companies, they too will have a duty to act responsibly and then I think the question becomes, what exactly should that duty to act responsibly look like in the case of generative AI? A lot of the things we've been talking about today would obviously be a very central part of that.

June 13th, 2024 / 4:25 p.m.

Associate Professor, University of British Columbia, As an Individual

Dr. Heidi Tworek

Thank you very much.

I served with Mr. Krishnamurthy on the expert advisory group. This is something we grappled with quite a lot. Of course, major platforms like Facebook and so on have many employees and can easily staff up, but we often see these harms coming, particularly now with generative AI lowering the barrier to entry, from a couple of individuals who create complete havoc, or from very small firms.

I think there are two aspects to this question. One is the very important question of international co-operation on this. We've talked as if all of the individuals creating harm would be located in Canada, but the truth is that many of them may be located outside of Canada. I think we need to think about what international co-operation looks like. We have this for counterterrorism in the online space, and we need to think about this for deepfakes.

In the case of smaller companies, we can distinguish those that I think are being abused, and then there is the question of how the new proposed online bill, Bill C-63, could have a digital safety commissioner who actually helps those smaller firms to ensure that these deepfakes are removed.

Finally, we have the question of the more nefarious smaller-firm actors and whether we need to have Bill C-63 expanded to be able to be nimble and shut down those kinds of nefarious actors more quickly—or, for example, tools that are only really being put up in order to create deepfakes of the terrible kinds that have been described by other witnesses.

I would just emphasize that the international co-operation, finally, is key. Taking things down in Canada only will potentially lead to revictimization, as something might be stored in a server in another country and then continually reuploaded.

June 13th, 2024 / 4:20 p.m.

Conservative

Rachael Thomas Conservative Lethbridge, AB

Thank you.

Bill C-63 does something very interesting. Rather than updating the Criminal Code to include deepfakes.... It doesn't do that at all. Deepfakes aren't mentioned in the bill, and folks are not protected from them—not in any criminal way. I believe that's an issue.

Secondly, platforms will be subject to perhaps an assessment and a ruling by an extrajudicial, bureaucratic arm that will be created, but again, there's nothing criminal attached to platforms allowing for the perpetuation of things like deepfakes or other non-consensual images.

Does that not concern you?

June 13th, 2024 / 4:10 p.m.

Keita Szemok-Uto Lawyer, As an Individual

Madam Chair, committee members, thank you for the opportunity to speak before you this afternoon.

By way of brief background, my name is Keita Szemok-Uto. I'm from Vancouver. I was just called to the bar last month. I've been practising primarily in family law, along with a mix of privacy and workplace law. I attended law school at Dalhousie in Halifax, and while there I took a privacy law course. I chose to write my term paper on the concept of deepfake videos, which we've been discussing today. I was interested in the way that a person could create a deepfake video, specifically a sexual or pornographic one, and how that could violate a person's privacy rights, and in writing that paper I discovered the clear gendered dynamic to the creation and dissemination of these kinds of deepfake videos.

As a case in point, around January this year somebody online made and publicly distributed sexually explicit AI deepfake images of Taylor Swift. They were quickly shared on Twitter, repeatedly viewed—I think one photo was seen as many as 50 million times. In an Associated Press article, a professor at George Washington University in the United States referenced women as “canaries in the coal mine” when it comes to the abuse of artificial intelligence. She is quoted, “It's not just going to be the 14-year-old girl or Taylor Swift. It's going to be politicians. It's going to be world leaders. It's going to be elections.”

Even back before this, in April 2022, it was striking to see the capacity for essentially anybody to take photos from somebody's social media, turn them into deepfakes and distribute them widely without, really, any regulation. Again, the targets of these deepfakes, while they can be celebrities or world leaders, oftentimes are people without the kinds of finances or protections of a well-known celebrity. Worst of all, in writing this paper I discovered that there is really no adequate system of law yet that protects victims from this kind of privacy invasion. I think that's something that is only now being addressed somewhat with the online harms bill.

I did look at the Criminal Code, section 162, which prohibits the publication, distribution or sale of an intimate image, but the definition of “intimate image” in that section is a video or photo in which a person is nude and the person had a reasonable expectation of privacy when it was made or when the offence was committed. Again, I think the “reasonable expectation of privacy” element will come up a lot in legal conversations about deepfakes. When you take somebody's social media photo, which is taken and posted publicly, it's questionable whether they had a reasonable expectation of privacy when it was taken.

In the paper, I looked at a variety of torts. I thought that if the criminal law can't protect victims, perhaps there is a private cause of action through which victims can sue and perhaps get damages. I looked at public disclosure of private facts, intrusion upon seclusion and other torts as well, and I just didn't find that anything really satisfied the circumstances of a pornographic deepfake scenario, again with the focus on reasonable expectation of privacy not really fitting the bill.

As I understand it today, there have been recent proposals for legislation, as well as legislation that has come into force. In British Columbia there's the Intimate Images Protection Act. That was from March 2023. The definition of "intimate image" in that act means a visual recording or visual simultaneous representation of an individual, whether or not they're identifiable and whether or not the image has been altered in any way, in which they're engaging in a sexual act.

The broadening of the definition of "intimate image", to cover not just an image of someone engaged in a sexual act when the photo is taken but also an image altered to make that representation, seems to be accomplished in the Intimate Images Protection Act. The drawback of that act is that, while it does provide a private right of action, the damages are limited to $5,000, which seems negligible in the grand scheme of things.

I suppose we'll talk more about Bill C-63 in this discussion, and I do think that it goes in the right direction in some regard. It does put a duty on operators to police and regulate what kind of material is online. Another benefit is that it expands the definitions, again, of the kinds of material that should be taken down.

That act, once passed, will require the operator to take down material that sexually victimizes a child or revictimizes a survivor—

June 13th, 2024 / 3:55 p.m.

Monique St. Germain General Counsel, Canadian Centre for Child Protection Inc.

Thank you for the opportunity today.

My name is Monique St. Germain, and I am general counsel for the Canadian Centre for Child Protection, which is a national charity with the goal of reducing the incidence of missing and sexually exploited children.

We operate cybertip.ca, Canada's national tip line for reporting the online sexual exploitation of children. Cybertip.ca receives and analyzes tips from the public and refers relevant information to police and child welfare as needed. Cybertip averages over 2,500 reports a month. Since inception, over 400,000 reports have been processed.

When cybertip.ca launched in 2002, the Internet was pretty basic, and the rise of social media was still to come. Over the years, technology has rapidly evolved without guardrails and without meaningful government intervention. The faulty construction of the Internet has enabled online predators to not only meet and abuse children online but to do so under the cover of anonymity. It has also enabled the proliferation of child sexual abuse material, CSAM, at a rate not seen before. Victims are caught in an endless cycle of abuse.

Things are getting worse. We have communities of offenders operating openly on the Tor network, also known as the dark web. They share tips and tricks about how to abuse children and how to avoid getting caught. They share deeply personal information about victims. CSAM is openly shared, not only in the dark recesses of the Internet but on websites, file-sharing platforms, forums and chats accessible to anyone with an Internet connection.

Countries have prioritized the arrest and conviction of individual offenders. While that absolutely has to happen, we've not tackled a crucial player: the companies themselves, whose products facilitate and amplify the harm. For example, Canada has only one known conviction and sentencing of a company for making CSAM available on the Internet. That prosecution took eight years and cost thousands of dollars. Criminal law cannot be the only tool; the problem is just too big.

Recognizing how rapidly CSAM was proliferating on the Internet, in 2017, we launched Project Arachnid. This innovative tool detects where CSAM is being made available publicly on the Internet and then sends a notice to request its removal. Operating at scale, it issues roughly 10,000 requests for removal each day and some days over 20,000. To date, over 40 million notices have been issued to over 1,000 service providers.

Through operating Project Arachnid, we've learned a lot about CSAM distribution, and, through cybertip.ca, we know how children are being targeted, victimized and sextorted on the platforms they use every day. The scale of harm is enormous.

Over the years, the CSAM circulating online has become increasingly disturbing, including elements of sadism, bondage, torture and bestiality. Victims are getting younger, and the abuse is more graphic. CSAM of adolescents is ending up on pornography sites, where it is difficult to remove unless the child comes forward and proves their age. The barriers to removal are endless, yet the upload of this material can happen in a flash, and children are paying the price.

It's no surprise that sexually explicit content harms children. For years, our laws in the off-line world protected them, but we abandoned that with the Internet. We know that everyone is harmed when exposed to CSAM. It can normalize harmful sexual acts, lead to distorted beliefs about the sexual availability of children and increase aggressive behaviour. CSAM fuels fantasies and can result in harm to other children.

In our review of Canadian case law regarding the production of CSAM in this country, 61% of offenders who produced CSAM also collected it.

CSAM is also used to groom children. Nearly half of the victims who responded to our survivor survey of victims of CSAM identified this tactic. Children are unknowingly being recorded by predators during an online interaction, and many are being sextorted thereafter. More sexual violence is occurring among children, and more children are mimicking adult predatory behaviour, bringing them into the criminal justice system.

CSAM is a record of a crime against a child, and its continued availability is ruining lives. Survivors tell us time and time again that the endless trading in their CSAM is a barrier to moving forward. They are living in constant fear of recognition and harassment. This is not right.

The burden of managing Internet harms has fallen largely to parents. This is unrealistic and unfair. We are thrilled to see legislative proposals like Bill C-63 to finally hold industry to account.

Prioritizing the removal of CSAM and intimate imagery is critical to protecting citizens. We welcome measures to mandate safety by design and tools like age verification or assurance technology to keep pornography away from children. We would also like to see increased use of tools like Project Arachnid to enhance removal and prevent the reuploading of CSAM. Also, as others have said, public education is critical. We need all the tools in the tool box.

Thank you.