An Act to enact the Online Harms Act, to amend the Criminal Code, the Canadian Human Rights Act and An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service and to make consequential and related amendments to other Acts

Sponsor

Arif Virani, Liberal

Status

Second reading (House), as of June 7, 2024


Summary

The following summary is from the published bill. The Library of Parliament often publishes better independent summaries.

Part 1 of this enactment enacts the Online Harms Act, whose purpose is to, among other things, promote the online safety of persons in Canada, reduce harms caused to persons in Canada as a result of harmful content online and ensure that the operators of social media services in respect of which that Act applies are transparent and accountable with respect to their duties under that Act.
That Act, among other things,
(a) establishes the Digital Safety Commission of Canada, whose mandate is to administer and enforce that Act, ensure that operators of social media services in respect of which that Act applies are transparent and accountable with respect to their duties under that Act and contribute to the development of standards with respect to online safety;
(b) creates the position of Digital Safety Ombudsperson of Canada, whose mandate is to provide support to users of social media services in respect of which that Act applies and advocate for the public interest in relation to online safety;
(c) establishes the Digital Safety Office of Canada, whose mandate is to support the Digital Safety Commission of Canada and the Digital Safety Ombudsperson of Canada in the fulfillment of their mandates;
(d) imposes on the operators of social media services in respect of which that Act applies
(i) a duty to act responsibly in respect of the services that they operate, including by implementing measures that are adequate to mitigate the risk that users will be exposed to harmful content on the services and submitting digital safety plans to the Digital Safety Commission of Canada,
(ii) a duty to protect children in respect of the services that they operate by integrating into the services design features that are provided for by regulations,
(iii) a duty to make content that sexually victimizes a child or revictimizes a survivor and intimate content communicated without consent inaccessible to persons in Canada in certain circumstances, and
(iv) a duty to keep all records that are necessary to determine whether they are complying with their duties under that Act;
(e) authorizes the Digital Safety Commission of Canada to accredit certain persons that conduct research or engage in education, advocacy or awareness activities that are related to that Act for the purposes of enabling those persons to have access to inventories of electronic data and to electronic data of the operators of social media services in respect of which that Act applies;
(f) provides that persons in Canada may make a complaint to the Digital Safety Commission of Canada that content on a social media service in respect of which that Act applies is content that sexually victimizes a child or revictimizes a survivor or intimate content communicated without consent and authorizes the Commission to make orders requiring the operators of those services to make that content inaccessible to persons in Canada;
(g) authorizes the Governor in Council to make regulations respecting the payment of charges by the operators of social media services in respect of which that Act applies, for the purpose of recovering costs incurred in relation to that Act.
Part 1 also makes consequential amendments to other Acts.
Part 2 amends the Criminal Code to, among other things,
(a) create a hate crime offence of committing an offence under that Act or any other Act of Parliament that is motivated by hatred based on certain factors;
(b) create a recognizance to keep the peace relating to hate propaganda and hate crime offences;
(c) define “hatred” for the purposes of the new offence and the hate propaganda offences; and
(d) increase the maximum sentences for the hate propaganda offences.
It also makes related amendments to other Acts.
Part 3 amends the Canadian Human Rights Act to provide that it is a discriminatory practice to communicate or cause to be communicated hate speech by means of the Internet or any other means of telecommunication in a context in which the hate speech is likely to foment detestation or vilification of an individual or group of individuals on the basis of a prohibited ground of discrimination. It authorizes the Canadian Human Rights Commission to deal with complaints alleging that discriminatory practice and authorizes the Canadian Human Rights Tribunal to inquire into such complaints and order remedies.
Part 4 amends An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service to, among other things,
(a) clarify the types of Internet services covered by that Act;
(b) simplify the mandatory notification process set out in section 3 by providing that all notifications be sent to a law enforcement body designated in the regulations;
(c) require that transmission data be provided with the mandatory notice in cases where the content is manifestly child pornography;
(d) extend the period of preservation of data related to an offence;
(e) extend the limitation period for the prosecution of an offence under that Act; and
(f) add certain regulation-making powers.
Part 5 contains a coordinating amendment.


Online Harms Act (Government Orders)

June 7th, 2024 / 1:05 p.m.

Conservative

Cheryl Gallant Conservative Renfrew—Nipissing—Pembroke, ON

Mr. Speaker, the hon. member mentioned the Human Rights Tribunal. Would calling online for the elimination of the State of Israel land someone before the Human Rights Tribunal? Would chanting “from the river to the sea”, which refers to the dismantling of Israel or the removal or extermination of its Jewish population, also land somebody before the Human Rights Tribunal?

Online Harms Act (Government Orders)

June 7th, 2024 / 1:05 p.m.

NDP

Peter Julian NDP New Westminster—Burnaby, BC

Mr. Speaker, I think this is why we need to have the rigorous committee process. I know Conservatives will try to throw out lines and ask, “Does this matter? Does this matter?”

With regard to the important aspect of definition, if we just look through part 1 of the bill, it is very clear. As for the definitions that apply, the member knows, as I am sure she read the bill, what definitions apply. In terms of what happens around the Criminal Code, we have concerns about the definitions and we need to be very clear about that.

Conservatives will take that issue of clarity and try to exploit it. I think it is important, as adults in the room, as legislators, as parliamentarians, that we go through that rigorous committee process and that we ensure that questions are answered. I do not believe that the kind of speculation that Conservatives do is helpful at all. Let us get the work done around the bill. It is definitely needed to combat online harms. Let us make sure the definitions are clear and concise.

Online Harms Act (Government Orders)

June 7th, 2024 / 1:05 p.m.

Bloc

Martin Champoux Bloc Drummond, QC

Mr. Speaker, I know that my colleague from New Westminster—Burnaby also cares about regulating what happens on the web. We had the opportunity to work together at the Standing Committee on Canadian Heritage on various topics that have to do with this issue.

We have been waiting a long time for Bill C‑63. I think that there is consensus on part 1. As the Bloc Québécois has been saying all day, we are proposing that the bill be split so that part 1, the one part we all agree on, can be passed quickly.

The trouble is with part 2 and the subsequent parts. There are a lot of things that deserve to be discussed. There is one in particular that raises a major red flag, as far as I am concerned. It is the idea that a person could file a complaint because they fear that at some point, someone might utter hate speech or commit a crime as described in the clauses of the bill. A complaint could be filed simply on the presumption that a person might commit this type of crime.

To me, that seems to promote a sort of climate of accusation that could lead to paranoia. It makes me think of the movie Minority Report. I am sure my colleague has heard of it. I would like his impressions of this type of thing that we find in Bill C‑63.

Online Harms Act (Government Orders)

June 7th, 2024 / 1:10 p.m.

NDP

Peter Julian NDP New Westminster—Burnaby, BC

Mr. Speaker, that is why we would like the bill to go to committee for a thorough study, because it is important in the context of this bill.

That said, we know that hate crimes are on the rise. We are seeing more and more anti-Semitism, Islamophobia, racism, misogyny, homophobia, transphobia, and so on. That is why it is important to have clear definitions in the bill.

At this stage of the bill's consideration, we are being asked to vote on the principle of the bill. The bill seeks to reduce online harm, and we agree with that principle. However, there are still many questions and details to be studied. We will have the opportunity to amend the bill in committee to remove certain parts or add others. There is still a lot of work to be done. The NDP wants to refer the bill to committee so that we can begin that work.

Online Harms Act (Government Orders)

June 7th, 2024 / 1:10 p.m.

NDP

Alexandre Boulerice NDP Rosemont—La Petite-Patrie, QC

Mr. Speaker, I thank my NDP colleague from New Westminster—Burnaby for his speech and his involvement in this serious issue.

Unfortunately, we have more proof that the Liberals are dragging their feet and waiting to take action. Online hate is a real problem. Many children and teenagers are experiencing social media in harmful, aggressive and damaging ways. These young people are often the victims of cyberbullying and cyber-attacks, which create very tense situations. The Liberals have not done anything about that.

My colleague is right in saying the Liberals missed something in this bill. The Minister of Justice does not see it. The algorithms are creating echo chambers where people with far-right perspectives, who are racist, homophobic, transphobic and sexist, feed off each other. For example, the phenomenon of fake news is on the rise. The Liberals do not dare touch the issue of secret algorithms.

Why does my colleague think that the Liberals do not dare take that fundamental step in the fight against online hate?

Online Harms Act (Government Orders)

June 7th, 2024 / 1:10 p.m.

NDP

Peter Julian NDP New Westminster—Burnaby, BC

Mr. Speaker, that is a really great question from my colleague from Rosemont—La Petite-Patrie.

I know that he has done a lot of work to protect children. As a father, it is important for my colleague to ensure that children are not inundated with toxic content that encourages them to self-harm or to commit suicide. It is appalling to see what is out there.

My colleague is right to talk about the Liberals' abject failure. Everyone heard the Prime Minister say in 2021 that he was going to introduce a bill within 100 days to counter all the attacks, the hate crimes and the attacks on children that we are seeing. It took another two years.

Furthermore, the Liberals did not touch on the real profit maker for the web giants: the algorithms. Algorithms rake in incredible profits for these companies. They did not seem to want to look at this key element, and we can speculate as to why. However, we want to get answers to this question, and that is something we are going to do in committee.

Online Harms Act (Government Orders)

June 7th, 2024 / 1:10 p.m.

Liberal

Kevin Lamoureux Liberal Winnipeg North, Manitoba, Parliamentary Secretary to the Leader of the Government in the House of Commons

Mr. Speaker, it is a pleasure to be able to rise and speak to Bill C-63.

We often talk about the communities and neighbourhoods in which we live. We do this not only as parliamentarians but also as politicians in general, whether at the municipal, provincial, or federal level. We talk about how we want people to feel safe. People need to feel safe in their homes, in their communities and in the places where they live. That has always been a priority for the current government and, I would like to think, for all parliamentarians of all political stripes. However, sometimes we need to look at finding a better definition of what we mean when we talk about keeping people safe in our communities.

The Internet is a wonderful thing, and it plays a critical and important role in society today. In fact, I would argue that, nowadays, it is an essential service that is virtually required in all communities. We see provincial and national governments investing greatly to ensure that there is more access to the Internet. We have become more and more dependent on it in so many different ways. It is, for all intents and purposes, a part of the community.

I could go back to the days when I was a child, and my parents would tell me to go outside and play. My own children, too, were encouraged to go outside and play. Then things such as Nintendo came out, and people started gravitating toward the TV and playing computer games. I have grandchildren now, and I get the opportunity to see my two grandsons quite a bit. I can tell members that, when I do, I am totally amazed at what they are doing on the Internet and with technology. There are incredible programs associated with it, from gaming to YouTube, that I would suggest are a part of the community. Therefore, when we say that we want to protect our children in our communities when they are outside, we also need to protect them when they are inside.

It is easy for mega platforms to say it is not their responsibility but that of the parent or guardian. From my perspective, that is a cop-out. We have a responsibility here, and we need to recognize that responsibility. That is what Bill C-63 is all about.

Some people will talk about freedom of speech and so forth. I am all for freedom of speech. In fact, I just received an email from a constituent who is quite upset about how the profanity and flags displayed on a particular vehicle driving around are promoting all sorts of nastiness in the community. I indicated to them that freedom of speech entitles that individual to do so.

I care deeply about the fact that we, as a political party, brought in the Charter of Rights and Freedoms, which guarantees freedom of speech and expression. At the end of the day, I will always advocate for freedom of speech, but there are limitations. I believe that, if we look at Bill C-63, we can get a better sense of the types of limitations the government is talking about. Not only that, but I believe they are a reflection of a lot of the work that has been put together in order to bring the legislation before us today.

I understand some of the comments that have been brought forward, depending on which political parties addressed the bill so far. However, the minister himself has reinforced that this is not something that was done on a napkin; it is something that has taken a great deal of time, effort and resources to make sure that we got it right. The minister was very clear about the consultations that were done, the research that took a look at what has been done in other countries, and what is being said here in our communities. There are a great number of people who have been engaged in the legislation. I suspect that once it gets to committee we will continue to hear a wide spectrum of opinions and thoughts on it.

I do not believe that as legislators we should be put off to such a degree that we do not take action. I am inclined to agree with the minister that this is a holistic approach to dealing with an important issue. We should not be looking at ways to divide the legislation. Rather, we should be looking at ways it can be improved. The minister himself, earlier today, said that if members have ideas or amendments they believe will strengthen the legislation, then let us hear them. Bring them forward.

Often there is a great deal of debate at second reading and not as much at third reading. I suggest that this might be the type of legislation that would benefit from passing relatively quickly out of second reading, once members have had the opportunity to provide some thoughts, in favour of more debate time at third reading and, more specifically, at the committee stage. That would allow members, for example, to have discussions with constituents over the summer, knowing full well that the bill is at committee. I think there is a great deal of merit to that.

There was something that spoke volumes about keeping the community safe and about the impact the Internet has today, on our children in particular. Platforms have a responsibility, and we have to ensure that they are living up to that responsibility.

I want to speak about Carol Todd, the mother of Amanda Todd, to whom reference has already been made. Ultimately, I believe, she is one of the primary reasons why the legislation is so critically important. Amanda Michelle Todd was born November 27, 1996, and passed away October 10, 2012. Colleagues can do the math. She was a 15-year-old Canadian student and a victim of cyberbullying who hanged herself at her home in Port Coquitlam, British Columbia. There is a great deal of information on the Internet about Amanda. I thank her mother, Carol, for having the courage to share her daughter's story, because it is quite tragic.

I think there is a lot of blame that can be passed around, whether it is to the government, the private sector or society, including individuals. Carol Todd made reference to the thought that her daughter Amanda might still actually be alive if, in fact, Bill C-63 had been law at the time. She said, “As a mom, and having gone through the story that I've gone through with Amanda, this needs to be bipartisan. All parties in the House of Commons need to look in their hearts and look at young Canadians. Our job is to protect them. And parents, we can't do it alone. The government has to step in and that's what we are calling for.”

That is a personal appeal, and it is not that often I will bring up a personal appeal of this nature. I thought it was warranted because I believe it really amplifies and humanizes why this legislation is so important. Some members, as we have seen in the debate already, have indicated that they disagree with certain aspects of the legislation, and that is fine. I can appreciate that there will be diverse opinions on this legislation. However, let us not use that as a way to ultimately prevent the legislation from moving forward.

Years of consultation and work have been put into the legislation to get it to where it is today. I would suggest, given we all have had discussions related to these types of issues, during private members' bills or with constituents, we understand the importance of freedom of speech. We know why we have the Charter of Rights. We understand the basics of hate crime and we all, I believe, acknowledge that freedom of speech does have some limitations to it.

I would like to talk about some of the things we should think about, in terms of responsibilities, when we think about platforms. I want to focus on platforms in my last three minutes. Platforms have a responsibility to be responsible. It is not all about profit. There is a societal responsibility that platforms have, and if they are not prepared to take it upon themselves to be responsible, then the government does need to take more actions.

Platforms need to understand and appreciate that there are certain aspects of society, and here we are talking about children, that need to be protected. Platforms cannot pass the buck on to parents and guardians. Yes, parents and guardians have the primary responsibility, but the Internet never shuts down. Even parents and guardians have limitations. Platforms need to recognize that they also have a responsibility to protect children.

Content that sexually victimizes children, and intimate content that is shared without consent, are the types of things platforms have to exercise due diligence on. When such an issue is raised with platforms, there is a moral and, with the passage of this legislation, a legal obligation for them to take action. I am surprised it has taken this type of legislation to hit that point home. At the end of the day, whether a life is lost, people are bullied, or depression and mental health issues result from things of that nature, platforms have to take responsibility.

There are other aspects that we need to be very much aware of. Content inciting violent extremism or terrorism needs to be flagged. Content that induces a child to harm themselves also needs to be flagged. As has been pointed out, this legislation would have a real, positive, profound impact without taking away anyone's freedom of speech. It does not apply to private conversations or communications.

I will leave it at that and continue at a later date.