Protecting Young Persons from Exposure to Pornography Act

An Act to restrict young persons’ online access to sexually explicit material

Status

Report stage (House), as of June 7, 2024


Summary

This summary is from the published bill.

This enactment makes it an offence for organizations to make sexually explicit material available to young persons on the Internet. It also enables a designated enforcement authority to take steps to prevent sexually explicit material from being made available to young persons on the Internet in Canada.

Elsewhere

All sorts of information on this bill is available at LEGISinfo, an excellent resource from the Library of Parliament. You can also read the full text of the bill.

Votes

Dec. 13, 2023 Passed 2nd reading of Bill S-210, An Act to restrict young persons’ online access to sexually explicit material

June 11th, 2024 / 6:20 p.m.

Liberal

Patricia Lattanzio Liberal Saint-Léonard—Saint-Michel, QC

My next question is addressed to Ms. Laidlaw.

Bill C-63 was developed to ensure compliance with all existing privacy laws and global best practices. Do you have any concerns related to the privacy implications of Bill S-210? Also, how do we ensure privacy is upheld in the development of online safety regulations?

June 11th, 2024 / 5:30 p.m.

Dr. Jocelyn Monsma Selby Clinical therapist, Researcher Specialising in Forensic Sexology and Addiction, and Chair, Connecting to Protect

Thank you for the opportunity to be here today.

My submission to you comes from 43 years of clinical practice and research and from chairing Connecting to Protect's global summit in 2022, which involved 23 countries addressing harms stemming from children accessing pornography online.

My experience links me directly to the consequences of childhood access to online pornography, which results in problematic sexual behaviour, including difficulties in conducting relationships, destruction of the family and, in more extreme cases, criminal behaviour. Access to pornography by children who are unable to process and understand the material is like a gateway drug, setting up future abuse and all the attendant consequences.

For the last 13 years, I've treated individuals with compulsive sexual behaviour disorder and individuals who've been accessing child sexual exploitation material online. We are facing a global epidemic of online child sexual abuse and exploitation as a result of unregulated access to the Internet. We're getting it wrong and we're missing the mark in protecting children.

My colleague and I have outlined in detail what we consider to be the proposed solution in our brief for Bill S-210. We simply advocate shifting the narrative from the focus on age verification to a broader consideration of age assurance options, in conjunction with device-level controls operating at the point of online access through Google, Apple or Microsoft. This approach is technologically possible and relatively quick to implement, with far greater reach and effectiveness. Device-level controls coupled with a multi-dimensional public health approach are needed, including the implementation of protective legislation and policy.

Sadly, sexual exploitation is happening right now in Canada, feeding the production of illegal sexually explicit material online. Cybertip.ca receives millions of reports of child sexual exploitation material yearly, while 39% of luring attempts reported to Cybertip.ca in the last several years involved victims under 13 years of age. Globally, from 2020 to 2022, WeProtect's global threat assessment—and I hope you're sitting down for this—found a 360% increase in self-generated sexual imagery of seven to 10-year-olds.

How does this happen? It is wrong on so many levels. There is not a child protection expert on the planet who agrees that this is okay. It's child sexual abuse via digital images.

The harms to children due to accessing legal and illegal sexually explicit material online include trauma, exploitation, self-produced sexual images, child-on-child abuse, objectification, violence, risky sexual behaviours, depression, difficulties in forming and maintaining close relationships, anxiety disorder, panic attacks, PTSD and complex PTSD symptoms, among others. Potential health issues and addiction carry on into adulthood, causing documented long-term mental health consequences that impact personal and family relationships and the very fabric of our society, unless there is early identification and treatment of the problem.

You might be wondering how certain individuals are vulnerable to developing a problem like this or a compulsive sexual behaviour disorder. It almost always involves access to legal sexually explicit material online at an early age. The average age of exposure is 12 years old.

I want to talk to you about the erototoxic implications of sexually explicit material online. We know we need to do something—

June 11th, 2024 / 5:10 p.m.

Dr. Emily Laidlaw Associate Professor and Canada Research Chair in Cybersecurity Law, University of Calgary, As an Individual

Thank you for inviting me.

With my time, I'm going to focus on social media regulation and on Bills C-63 and S-210.

Social media has historically been lightly regulated. Online safety has only been addressed if companies felt like it or they were pressured by the market. There have been some innovative solutions, and we need them to continue to innovate, but safety has generally taken a back seat to other interests.

Social media companies have also privately set rules for freedom of expression, privacy and children's rights. There are no minimum standards and no ways to hold companies accountable. That is changing globally. Many jurisdictions have passed online harms legislation. The online harms act, which is part of Bill C-63, aligns with global approaches. In my view, with tweaks, Bill C-63 is the number one avenue to address illegal sexually explicit content and sexual exploitation.

Bill S-210 would mandate age verification to access sites with sexually explicit material. It is a flawed bill, yes, but more importantly, it is unnecessary for two reasons.

First, age verification is the crucial next frontier of online safety, but it is about more than sexually explicit material; it is about child safety broadly. The technology is evolving, and if we are committed to freedom of expression, privacy and cybersecurity, how this technology is used must be scrutinized closely.

Second, age verification is only one tool in the tool box. A holistic approach is needed whereby safety is considered in product design, content moderation systems and the algorithms. Let me give you a few examples of safety by design that does not involve age verification.

Child luring and sextortion rates are rising. What steps could social media take? Flag unusual friend requests from strangers and people in distant locations. Remove network expansion prompts whereby friends are recommended based on location and interest. Provide easy-to-use complaints mechanisms. Provide user empowerment tools, like blocking accounts.

The non-consensual disclosure of intimate images and child sexual abuse material requires immediate action. Does the social media service offer quick takedown mechanisms? Does it follow through with them? Does it flag synthetic media like deepfakes? How usable are the complaints mechanisms?

For example, Discord has been used to livestream child sexual exploitation content. The Australian eSafety Commissioner reported that Discord does not enable in-service reporting of livestreamed abuse. This is an easy fix.

The last example is that the Canadian Centre for Child Protection offers a tool to industry, called Project Arachnid, to proactively detect child sexual abuse material. Should social media companies be using this to detect and remove content?

In my view, Bill C-63, again with tweaks, is the best avenue to address sexual exploitation generally. I think the focus should be on how to improve that bill. There are many reasons for that. I'll give two here.

First, the bill imposes three different types of responsibility. Vivek discussed this. Notably, the strongest obligation is the power of the commissioner to order the removal of child sexual abuse content and non-consensual disclosure of intimate images. This recognizes the need for the swift removal of the worst kinds of content.

Second, all of this would be overseen by a digital safety commission, ombudsperson and office. Courts are never going to be fast to resolve the kinds of disputes here, and they're costly. The power of the commissioner to order the removal of the worst forms of content is crucial to providing access to justice.

Courts are just ill-suited to oversee safety by design as well, which is necessarily an iterative process between the commission and companies. The tech evolves, and so do the harm and the solutions.

With my remaining time, I want to flag one challenge before I close, which Vivek mentioned as well. That is private messaging. Bill C-63 does not tackle private messaging. This is a logical decision; otherwise, it opens a can of worms.

Many of the harms explored here happen on private messaging. The key here is not to undermine privacy and cybersecurity protections. One way to bring private messaging into the bill and avoid undermining these protections is to impose safety obligations on the things that surround private messaging. I've mentioned many, such as complaints mechanisms, suspicious friend requests and so on.

Thank you for your time. I welcome questions.

May 30th, 2024 / 10:15 a.m.

NDP

Alistair MacGregor NDP Cowichan—Malahat—Langford, BC

Yes, Chair.

Given how heavy our workload is going to be next week, and given the timelines we're dealing with on Bill S-210, I'm just wondering if we have unanimous consent from this committee to ask for a formal extension so that we can give Bill S-210 proper study, because Bill C-70 is obviously going to take priority in this committee.

Can I get unanimous consent for that?

May 27th, 2024 / 7:10 p.m.

NDP

Alistair MacGregor NDP Cowichan—Malahat—Langford, BC

Thank you, Mr. Chair.

The testimony today has been interesting. I think it's led the committee down a few different paths as we consider Bill S-210.

Mr. Ripley, we've heard a lot about the potential pitfalls with the term “sexually explicit material” as referenced in the Criminal Code and how it could be overly broad. If we go down further in the bill—still on page three, in proposed subclause 6(2)—it says:

No organization shall be convicted of an offence under section 5 if the act that is alleged to constitute the offence has a legitimate purpose related to science, medicine, education or the arts.

Wouldn't that “legitimate purpose” phrase, from the department's standpoint, save streaming companies like Netflix, since they would be under the arts category? How do you interpret that section?

Does that add further clarity to the concerns you raised about the definition of sexually explicit material?

May 27th, 2024 / 7:10 p.m.

Bloc

Kristina Michaud Bloc Avignon—La Mitis—Matane—Matapédia, QC

Thank you, Mr. Chair.

Mr. Ripley, I want to go back to the definition of sexually explicit material. My colleague Mr. Genuis raised a good point. If there's a concern that nudity or a sex scene in a film or series will be considered sexually explicit material if Bill S‑210 comes into force, it raises questions about the sexually explicit material that already exists. Some movie scenes are already considered as such.

That's why I'm not sure what your concern is.

May 27th, 2024 / 6:55 p.m.

Conservative

Garnett Genuis Conservative Sherwood Park—Fort Saskatchewan, AB

Thank you, Chair.

Thank you to our witnesses.

We're obviously familiar with the Liberal government's position on this bill. With respect to the officials, of course you're in a position to support that position. Your role as an official is not to come here and state your disagreement with government policy, even if you might privately disagree with government policy.

I will just say that I think that many of the arguments you put forward were clearly refuted by the senator already. I also want to say that I think Bill C-63 is a real disaster. It raises actual censorship issues. It has nothing on age verification. It's far, far broader than Bill S-210 at every level. It's enforced by vaguely empowered bureaucratic agencies and it includes dealing with speech.

Most Canadians who have seen what your government did.... To be fair, I understand your role as a non-partisan public servant, tasked with providing fearless advice and faithful implementation. However, what the Liberal government has put forward in Bill C-63 is not being well received across the board.

On the issues with section 171, I'm looking at the Criminal Code and trying to understand the argument here.

We have one definition of sexually explicit material in the Criminal Code. Implicitly, it's being suggested that maybe we could have multiple different definitions of sexually explicit material operating at the same time. However, it seems eminently logical that you would have one definition that relies on the existing jurisprudence.

Mr. Bittle has suggested that if this definition covers Game of Thrones, then it's already a problem, because it already violates the Criminal Code if, in the commission of another offence, you were to show a child that material. Therefore, you could already run afoul of the Criminal Code if you put on Game of Thrones in your home for your 16-year-old. That's not happening. No one's getting arrested and going to jail because they let their 16-year-old watch Game of Thrones. If that's not happening already off-line, then maybe that suggests that this extensive reinterpretation of what the existing law already says is a little bit exaggerated.

In this context, we also know that Pornhub has been represented by a well-connected Liberal lobbyist who has met with Liberals in the lead-up to the vote.

I want to ask the Privacy Commissioner about what he said in terms of potential amendments.

How would this apply on social media? I'm going to just pose the question. I have young children. I obviously don't want them accessing the major, well-known pornography websites. I also don't want them seeing pornographic material on any other website that they might go to for a legitimate purpose. Therefore, if my children are on social media—they're not—or if they were on another website, if they were watching a YouTube video on that, whatever it was, I would want to ensure that 6-, 7-, 8-, 9-, 10-, 11- and 12-year-olds were not accessing pornography, regardless of the platform and regardless of the percentage of that company's overall business model.

I don't really understand philosophically why it would make sense or protect anyone's privacy to have an exemption for sites where it's just a small part of what they do, because if the point is to protect children, then the point is to protect children wherever they are.

I'd be curious for your response to that.

May 27th, 2024 / 6:45 p.m.

Bloc

Kristina Michaud Bloc Avignon—La Mitis—Matane—Matapédia, QC

You also have concerns about the protection of privacy and personal information.

Comparisons are often made with Bill C‑63, but in my opinion, the two are quite different. Bill C‑63 aims to protect children from harmful online content, which is commendable. Bill S‑210 seeks to limit access to pornography.

The regulator you want to create through Bill C‑63 seems as though it could be very effective in playing that kind of role. The digital safety commission could play the same role as commissions in other countries. The same goes for the age verification processes.

Can you tell us what concerns you have regarding privacy, as well as any other concerns?

May 27th, 2024 / 6:25 p.m.

Owen Ripley Associate Assistant Deputy Minister, Cultural Affairs, Department of Canadian Heritage

Mr. Chair, thank you for inviting me to discuss Bill S‑210. As the associate assistant deputy minister for cultural affairs at the Department of Canadian Heritage, I will be responsible for the Online Harms Act that is being proposed as part of Bill C‑63.

While Bill C‑63 was being drafted, the department heard directly from experts, survivors from civil society and members of the public on what should be done to combat the proliferation of harmful content online.

A common theme emerged from these consultations: the vulnerability of children online and the need to take proactive measures to protect them. With this in mind, the future online harms act proposes a duty to protect children, which will require platforms to incorporate age-appropriate design features for children. Bill C‑63 also proposes a specialized regulatory authority that will have the skills and expertise to develop regulations, guidance and codes of practice, in consultation with experts and civil society.

Bill S-210 seeks to achieve a similarly admirable goal of protecting children online. However, the bill is highly problematic for a number of reasons, including a scope that is much too broad in terms of regulated services, as well as regulated content; possible risk to Canadians' privacy, especially considering the current state of age-verification frameworks internationally; structural incoherence that seems to mix criminal elements with regulatory elements; a troubling dependence on website blocking as the primary enforcement mechanism; and a lack of clarity around implementation and an unrealistic implementation timeline.

I'll briefly unpack a few of these concerns in greater detail.

As drafted, Bill S-210 would capture a broad range of websites and services that make sexually explicit material available on the Internet for commercial purposes, including search engines, social media platforms, streaming and video-on-demand applications, and Internet service providers. Moreover, the bill's definition of sexually explicit material is not limited to pornography but instead extends to a broader range of mainstream entertainment content with nudity or sex scenes, including content that would be found on services like Netflix, Disney+, or CBC Gem. Mandating age-verification requirements for this scope of services and content would have far-reaching implications for how Canadians access and use the Internet.

While efforts are under way globally in other jurisdictions to develop and prescribe age-verification technologies, there is still a lack of consensus that they are sufficiently accurate and sufficiently privacy-respecting. For example, France and Australia remain concerned that the technology is not yet sufficiently mature, and the testing of various approaches is ongoing. Over the next couple of years, the U.K. will ultimately require age assurance for certain types of services under its Online Safety Act. Ofcom is currently consulting on the principles that should guide the rollout of these technologies. However, the requirement is not yet in force, and services do not yet have to deploy age assurance at scale. In jurisdictions that have already moved ahead, such as certain U.S. states or Germany, there continue to be questions about privacy, effectiveness and overall compliance.

In short, these international examples show that mandates regarding age verification or age assurance are still a work in progress. There is also no other jurisdiction proposing a framework comparable in scope to Bill S-210. Website blocking remains a highly contentious enforcement instrument that poses a range of challenges and could impact Canadians' freedom of speech and Canada's commitment to an open and free Internet and to net neutrality.

I want to state once again that the government remains committed to better protecting children online. However, the government feels that the answer is not to prescribe a specific technology that puts privacy at risk and violates our commitment to an open Internet. It is critical that any measures developed to achieve this goal create a framework for protecting children online that is both flexible and well-informed.

Thank you for your attention. I look forward to any questions you may have.

May 27th, 2024 / 6:20 p.m.

Philippe Dufresne Privacy Commissioner of Canada, Office of the Privacy Commissioner of Canada

Thank you, Mr. Chair.

Thank you to the Members of the Committee for this invitation to appear on your study of Bill S‑210, An Act to restrict young persons’ online access to sexually explicit material.

As Privacy Commissioner of Canada, my mandate is to protect and promote individuals’ fundamental right to privacy. This includes providing advice, guidance, and recommendations for protecting personal information, and overseeing compliance with Canada’s two federal privacy laws—the Privacy Act, which applies to federal government institutions, and the Personal Information Protection and Electronic Documents Act, which is Canada's federal private-sector privacy law.

In January, I launched my Strategic Plan for my Office, which is focused on three priority areas: maximizing the OPC’s impact in promoting and protecting the fundamental right to privacy; addressing and advocating for privacy in this time of technological change; and championing the privacy rights of children.

I support the purposes of Bill S-210, which include protecting the mental health of young people from the harmful effects of being exposed to sexually explicit material, but the bill raises some privacy implications, and I would propose some changes to address them.

As drafted, the bill provides that any organization that makes sexually explicit material available online to a young person for commercial purposes is guilty of an offence and liable to a fine that would increase in amount, depending on whether it was a first or subsequent offence. A defence is available if an organization believed the young person was at least 18 years of age, having implemented a prescribed age verification method to limit access to the sexually explicit material.

Age verification can raise privacy implications, as it generally requires the collection of personal information, which could include biometrics or identity documentation. As drafted, the bill would apply to services, such as social media and search engines, that may make available some sexually explicit material, but may be primarily focused on other content. This could result in age verification requirements, including when the majority of content may not be of a sexually explicit nature.

To address this, the committee could consider restricting the requirement for age verification to websites that primarily provide sexually explicit material for commercial purposes.

Before prescribing an age-verification method, the Governor in Council would need to consider certain criteria, including whether the method maintains user privacy and protects users’ personal information. These criteria are important and beneficial.

I would recommend adding additional criteria to the list to ensure that the prescribed methods are sufficiently privacy protective. Specifically, this could include assessing whether the prescribed methods are proportionate and limit the collection of personal information to what is strictly necessary for the verification. Age-verification methods should also prevent tracking or profiling of individuals across visits to websites or services.

Internationally, various jurisdictions have taken action to prevent children from accessing pornography, but some of these laws have a narrower application than Bill S-210. For example, Texas and Utah only require age verification measures on sites that meet a certain threshold of pornographic content. Some regulators have also worked to mitigate the privacy risk associated with the use of age verification technologies. For example, Spanish and French regulators have worked with researchers to develop and evaluate potential age verification mechanisms.

My office is conducting further research in this area and is a member of an international working group with other privacy regulators to share information on age-verification methods and learn from each other’s experiences. Notably, members of this working group intend to publish a joint statement of principles for age assurance later this year. My office is also developing guidance for organizations on age assurance and privacy, and will launch an exploratory consultation on this next month.

Finally, should Bill S-210 be adopted, I would be happy to provide advice on regulations that pertain to privacy and the protection of personal information at the appropriate time. I will be pleased to take your questions.

Thank you.

May 27th, 2024 / 5:30 p.m.

Bloc

Kristina Michaud Bloc Avignon—La Mitis—Matane—Matapédia, QC

I only have a few seconds left, but I want to hear your thoughts on the fact that, according to the government, we don't need Bill S-210, since there's Bill C-63. To my knowledge, they're not the same at all. Bill C‑63 is extremely important, to be sure, but it's not identical to Bill S‑210. Do you share that opinion?

May 27th, 2024 / 5:25 p.m.

Bloc

Kristina Michaud Bloc Avignon—La Mitis—Matane—Matapédia, QC

Thank you, Mr. Chair.

Senator, thank you for coming. I'm eager to hear you speak about Bill S-210, an important bill that aims to restrict young people's online access to sexually explicit material. We could also say that it aims to protect young people from learning about sex from online porn.

I liked the way you described the bill when you said that its purpose is to do online what we do offline. Things were much easier when children weren't allowed to go to convenience stores to buy magazines containing pornographic material. Now it's a little more complicated, with everything available online.

You will have noticed though that this bill does not meet with unanimous approval. The government voted against sending it to a parliamentary committee for study. It's thanks to the vote of the other opposition parties that the committee is able to study this bill today. I've read it and I think it's a good bill. In fact, the Bloc Québécois supports it.

Age verification is obviously not a simple matter. We have read about what's being done in other countries. You mentioned Germany, the UK, France, some US states, Spain and Australia, among others. From what I have read about the concrete measures taken by those countries, I note that the law has yet to be applied in many cases, or that it will only be applied in future pilot projects. So we don't necessarily have a clear indication of what is being done and what Canada could do, or examples from which to draw inspiration.

I'd like to hear your comments on a question you raised earlier, and I'll allow you to answer in French: Why did you choose to have the operating provisions in the regulations rather than right in the bill?

What kind of regulations would you like to see the government put in place? Aren't you worried about putting everything in the regulations, given that the government doesn't want anything to do with Bill S-210?

May 27th, 2024 / 5:05 p.m.

Conservative

Garnett Genuis Conservative Sherwood Park—Fort Saskatchewan, AB

Thank you.

Bill S-210 requires that some age verification method be used, but it does not prescribe the method. Is that correct? Why not explicitly define the age verification method in statute?

May 27th, 2024 / 5:05 p.m.

Conservative

Garnett Genuis Conservative Sherwood Park—Fort Saskatchewan, AB

Okay.

Does Bill S-210 create or call for the creation of a digital ID?

May 27th, 2024 / 5 p.m.

Julie Miville-Dechêne Senator, Quebec, ISG

Thank you, Mr. Chair.

Thank you for the invitation to talk about Bill S-210. I'd be happy to answer questions in French, but I'm going to give my speech in English, because that's the language in which most of the criticism has been voiced.

To answer your question directly, let me say the following: This bill has been the subject of two studies at committee, because of the election. We have heard from 24 witnesses and there were 28 briefs. So we can say that this bill has been thoroughly studied at the Senate.

Bill S-210 seeks to apply to online porn the rules that normally apply off-line. The bill does three things. First, it requires websites that offer porn to verify the age of users before they can access that content. Second, it sets up an enforcement mechanism that can result in non-complying websites getting blocked in Canada. Third, it provides that acceptable age verification methods will be decided in regulation, to be adopted after consultation and with input from experts. The bill specifies that any approved method must be reliable, collect information solely for age verification purposes, destroy any personal information once the verification is done and comply with best practices.

Polls indicate that close to 80% of Canadians support age verification to access porn online, but Bill S-210 has been attacked, and I want to correct some misrepresentations.

It has been said that no country has done this. This is false. Germany, France, the U.K., the European Union and several U.S. states have passed laws and regulations to impose age verification to access porn. Spain is expected to launch a pilot project soon. Australia, which had paused this work, announced last week that it would move ahead.

It has been painted as a partisan or ideological bill to control sexuality. False. Age verification is supported by the socialist government of Spain and the conservative government in the U.K. In California, an age verification bill was recently approved unanimously by two legislative committees. This is not partisan legislation.

It has been called an attack on free expression. False again. Bill S-210 would not affect the availability of porn for adults; it would simply prevent children from accessing it. In Europe, porn sites have challenged age verification laws and have failed at every stage. In the U.S.—a country known for its robust speech protection—porn sites have challenged the laws and they have failed all the way to the Supreme Court.

It has been said that age verification would mean submitting personal identification to porn sites. False again. In Europe and elsewhere, age verification is typically done by third party companies using methods that transmit no personal information to porn sites. These are among the best practices we would expect in Canada as well.

It has been said that this bill would block all forms of nudity. False. The bill uses the standard definition of pornography found in the Criminal Code. The bill also provides for usual exceptions for art, science and education.

It has been said that this bill would impose age verification on all websites. False. Bill S-210 only requires age verification to access porn content. If a website contains porn and non-porn content, age verification is only required to access the porn content.

It has been said that there's no way to check someone's age without compromising privacy. False. France is developing a double anonymous method. The U.K. regulator has recommended age estimation approaches that collect no information. Australia has explained its recent decision to move ahead with age verification by saying that the technology it looked at only a year ago has already improved.

Finally, it has been said that age verification is useless because kids will find ways around it. This is once again false. Actual studies show that only a small number of children know how to evade these restrictions. It's possible that some older teenagers and adults will use VPNs to bypass age verification, but it's highly unlikely that large numbers of eight, 10 and 12-year-olds will do so.

I will be happy to take questions, but please, please, don't let Canada be the last place on earth where pornographers are more protected than children.