Digital Charter Implementation Act, 2020

An Act to enact the Consumer Privacy Protection Act and the Personal Information and Data Protection Tribunal Act and to make consequential and related amendments to other Acts

This bill was last introduced in the 43rd Parliament, 2nd Session, which ended in August 2021.

Sponsor

Navdeep Bains  Liberal

Status

Second reading (House), as of April 19, 2021
(This bill did not become law.)

Summary

This is from the published bill. The Library of Parliament often publishes better independent summaries.

Part 1 enacts the Consumer Privacy Protection Act to protect the personal information of individuals while recognizing the need of organizations to collect, use or disclose personal information in the course of commercial activities. In consequence, it repeals Part 1 of the Personal Information Protection and Electronic Documents Act and changes the short title of that Act to the Electronic Documents Act. It also makes consequential and related amendments to other Acts.
Part 2 enacts the Personal Information and Data Protection Tribunal Act, which establishes an administrative tribunal to hear appeals of certain decisions made by the Privacy Commissioner under the Consumer Privacy Protection Act and to impose penalties for the contravention of certain provisions of that Act. It also makes a related amendment to the Administrative Tribunals Support Service of Canada Act.

Elsewhere

All sorts of information on this bill is available at LEGISinfo, an excellent resource from the Library of Parliament. You can also read the full text of the bill.

June 7th, 2021 / 12:30 p.m.

Juris Doctor Candidate, Advocate and Cybersecurity Researcher, As an Individual

Melissa Lukings

I think the issue is that the penalties that currently exist in PIPEDA are perhaps not strong enough to deter corporations. I'm not saying to put in new regulations—I'm not saying that—but when you're going to do the Digital Charter Implementation Act and you're discussing things like Bill C-10 and Bill C-11, it's important to remember that.

I think there is room for improvement. Because we've found that flat financial penalties don't really seem to impact companies that make a lot of money, fines could instead be based on percentages of revenue. The key here is that we should not add more regulation. If what we're trying to do is in fact what we say we're trying to do, which is to reduce human trafficking and harm to young people, additional regulations are not going to help that.

Did I answer your question?
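The percentage-based approach described above is already the model in the EU's GDPR, which caps administrative fines for the most serious infringements at 4% of a company's global annual turnover. A minimal sketch, using hypothetical revenue figures, of why a percentage-of-revenue fine scales with company size where a flat cap does not:

```python
# Illustrative only: the revenue figures and the $100k flat cap are hypothetical;
# the 4% rate mirrors the GDPR's upper tier for administrative fines.

def flat_fine(cap: float) -> float:
    """A fixed-amount penalty, regardless of the offender's size."""
    return cap

def percentage_fine(global_revenue: float, rate: float = 0.04) -> float:
    """A penalty computed as a share of global annual revenue."""
    return global_revenue * rate

small_firm = 10_000_000        # $10M annual revenue (hypothetical)
large_firm = 50_000_000_000    # $50B annual revenue (hypothetical)

for revenue in (small_firm, large_firm):
    flat = flat_fine(100_000)
    pct = percentage_fine(revenue)
    # A $100k flat fine is 1% of the small firm's revenue,
    # but only 0.0002% of the large firm's.
    print(f"revenue={revenue:,}: flat share={flat / revenue:.4%}, 4% fine={pct:,.0f}")
```

The point of the sketch is the asymmetry: the same flat fine shrinks to a rounding error for a large platform, while a percentage fine remains proportionally painful at any scale.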

May 18th, 2021 / 3:40 p.m.

Paul Manly Green Nanaimo—Ladysmith, BC

Thank you, Mr. Chair.

I want to see this bill get moving along, as well. I was surprised that when we hit clause 3, proposed section 4.1, there was a Conservative Party amendment to it, and after the amendment failed, there was a move to remove section 4.1. There was no debate. There was no call for a recorded vote. We didn't deal with that section at that time, and we're stuck in this ongoing filibuster.

The minister has given an explanation. We should get through the rest of these amendments. Section 4.1 can be added back in at report stage. If enough parliamentarians think that it's an important thing to have added, then that's what we should do.

I believe in freedom of speech as much as the next person, but I find that the whole system of algorithms on these private platforms doesn't really lend itself to freedom of speech at all. I get countless emails from constituents who say that there is no freedom of speech on Google, YouTube or Facebook, and that their comments or other content are being blocked, so that's another issue we need to deal with.

We're dealing with private platforms that are censoring people, and determining what gets bumped up and what gets bumped down. It's mostly for commercial interests and advertising, and to inflame people, to weaponize our anger at each other. I think we need to look at this.

We're coming up to Bill C-11 where we're going to be talking about these things, but we should get this Broadcasting Act done. If there's an amendment at report stage to fix and bring back section 4.1, that would be the time to do it. Let's get the rest of the amendments through.

May 17th, 2021 / 3:30 p.m.

Canada Research Chair in Internet and E-Commerce Law, Faculty of Law, University of Ottawa, As an Individual

Dr. Michael Geist

I appreciate the question.

First, for those who are not aware, net neutrality speaks to the need to treat all content in an equal fashion, regardless of source or destination. That's been a core principle, I thought, of successive governments, although it seemed like the heritage minister expressed some doubt on it, at least in one media interview around that issue.

Quite frankly, we just heard from Professor Trudel. He said that algorithms determine the type of content that is visible. That speaks exactly to the concerns around net neutrality and the notion that an algorithm can in fact undermine those net neutrality principles.

If it is being done at the behest of a government, which is precisely what is being proposed under this bill, the CRTC will be making those determinations. That is where the speech implications and the concerns from a net neutrality perspective arise. That is, I repeat, precisely why no country in the world does this. Nobody thinks it is appropriate to have a government make these kinds of choices about what gets prioritized or not prioritized with respect to content.

The algorithmic transparency that Professor Trudel mentioned is something entirely separate. In fact, it is something that is absolutely necessary from a regulatory perspective and is even included in Bill C-11, which the government, for whatever reason, has largely buried and hasn't moved forward.

It's not about whether we regulate algorithms; it's about whether the CRTC and the government use those algorithms to determine or prioritize or de-prioritize what we can see.

May 17th, 2021 / 2:35 p.m.

Dr. Michael Geist Canada Research Chair in Internet and E-Commerce Law, Faculty of Law, University of Ottawa, As an Individual

Thank you very much, Mr. Chair.

As you know, my name is Michael Geist. I appear in a personal capacity, representing only my own views. I always start with that statement, but it feels particularly necessary in this instance, given the misinformation and conspiracy theories that some have floated and that Minister Guilbeault has disappointingly retweeted.

As I am sure you are aware, I have been quite critical of Bill C-10. I would like to reiterate that criticism of the bill is not criticism of public support for culture or of regulation of technology companies. I think public support for culture is needed, and I think there are ways to ensure money for creator programs this year and not in five years, as in this bill.

Further, I am puzzled and discouraged by the lack of interest in Bill C-11, which would move toward modernizing Canada’s privacy rules to help address concerns about how these companies collect and use our data. The bill would also mandate algorithmic transparency, which is much needed and far different from government-mandated algorithmic outcomes.

I’ll confine my opening remarks to the charter-related questions and widespread concerns about the regulation of user-generated content, but would welcome questions on any aspect of the bill.

There is simply no debating that following the removal of proposed section 4.1, the bill now applies to user-generated content, since all audiovisual content is treated as a program under the act. You have heard experts say that and department officials say that. The attempts to deflect from that simple reality by pointing to proposed section 2.1 to argue that users are not regulated are deceptive and do not speak to the issue of regulating the content of users.

I will speak to the freedom of expression implications in a moment, but I want to pause to note that no one, literally no other country, uses broadcast regulation to regulate user-generated content in this way. There are good reasons that all other countries reject this approach. It is not that they don’t love their creators and want to avoid regulating Internet companies; it is that regulating user-generated content in this manner is entirely unworkable, a risk to net neutrality and a threat to freedom of expression. For example, the European Union, which is not shy about regulation, distinguishes between streaming services such as Netflix and video-sharing services such as TikTok or YouTube, with no equivalent regulations such as those found in Bill C-10 for user-generated content.

From a charter perspective, the statement issued by the Department of Justice last week simply does not contain analysis or discussion about how the regulation of user-generated content as a program intersects with the charter. There is similarly no discussion about whether this might constitute a violation that could be justified, no discussion on the implications of deprioritizing speech, no discussion on the use of terms such as “social media service” that are not even defined in the bill, and no discussion of the implementation issues that could require Canadians to disclose personal location-based information in order to comply with the new, ill-defined requirements.

In my view, the prioritization or deprioritization of speech by the government through the CRTC necessarily implicates freedom of expression. The charter statement should have acknowledged this reality and grappled with the question of whether it is saved by section 1. I do not believe it is.

First, the bill as drafted, with section 4.1 in it, was the attempt to minimally impair those speech rights. With it removed, the bill no longer does so.

Second, the discoverability policy objective is not enough to save the impairment of free speech rights. There is no evidence that there is a discoverability problem with user-generated content.

Ms. Yale’s panel, which notably appears to have lost its unanimity, recommended discoverability but cited no relevant evidence to support claims that there is an issue with user-generated content.

Third, the objective of making YouTube pay some additional amount to support music creation is not enough to save the impairment of free speech rights either. This isn’t about compensation, because the works are already licensed. This is about paying some additional fees, given concerns that section 4.1 would have broadly exempted YouTube. I am not convinced that was the case, as services such as YouTube Music Premium might well have been captured. I am not alone on that. Canadian Heritage officials thought so too in a memo they wrote to the minister. In fact, it was such a non-issue that Mr. Cash’s organization did not even specifically cite the provision or raise the issue in the brief that it submitted to this committee.

I find it remarkable that the minister and the charter statement effectively tell Canadians that they should trust the CRTC to appropriately address free speech rights but are unwilling to do the same with respect to how section 4.1 would be interpreted.

Let me conclude by noting that if a choice must be made between some additional payments by a streaming service and regulating the free speech rights of Canadians, I would have thought that standing behind freedom of expression would be an easy choice to make, and I have been genuinely shaken to find that my government thinks otherwise.

I look forward to your questions.

Privacy (Oral Questions)

May 11th, 2021 / 2:45 p.m.

Charlie Angus NDP Timmins—James Bay, ON

Mr. Speaker, Privacy Commissioner Daniel Therrien is raising serious alarm bells that Bill C-11 would undermine the fundamental privacy rights of Canadians. As a case in point, Clearview AI broke Canadian law when it took millions of photos of Canadians without their consent for its controversial facial recognition technology. The Privacy Commissioner is saying that Bill C-11 would actually protect the interests of companies like Clearview over the rights of Canadians.

Why are the Liberals using Bill C-11 to rewrite the privacy laws and stack the deck in favour of corporate outliers such as Clearview over protecting the rights of Canadian citizens?

May 10th, 2021 / 12:55 p.m.

Privacy Commissioner of Canada, Office of the Privacy Commissioner of Canada

Daniel Therrien

That's my concern, as well.

Indeed, I think it is quite possible that a court seized of a matter like Clearview AI under the CPPA would not necessarily uphold the decision that we have made, in part because of the way the balancing clause of the CPPA is drafted. I find that extremely concerning, as I do the limited nature of administrative penalties under Bill C-11.

May 10th, 2021 / 12:55 p.m.

Privacy Commissioner of Canada, Office of the Privacy Commissioner of Canada

Daniel Therrien

First, I would not ascribe motivation to those who have tabled Bill C-11, other than trying to balance commercial interests and privacy concerns and issues, and—

May 10th, 2021 / 12:55 p.m.

Charlie Angus NDP Timmins—James Bay, ON

Thank you, Mr. Chair.

Mr. Therrien, one thing I thought was really profound in your findings against Clearview AI was that you said it would essentially subject the citizens of this country to a perpetual police lineup.

What we're talking about is not dystopian science fiction. We should know, as citizens, that when our children go to the mall, they aren't being photographed and put into a database; that racialized citizens are not being targeted on the streets where they walk; and that the right to go into a public place is a public right and we should not be profiled, targeted or put into some form of database for collection.

The Clearview AI case was a really good opportunity for Canada to get this right, because it was so egregious. What you're telling us is that the laws were written, in a way, to protect these outlier companies, ignoring the growing awareness that's happening internationally.

With Bill C-11, if the government is refusing to make the necessary changes to put a human rights frame on the rights of privacy, and if it is going to insist on protecting the interests of corporations that may not have the best interests of our citizens at heart, would we be better off with the status quo than putting more weight on the side of companies and outliers like Clearview AI?

May 10th, 2021 / 12:50 p.m.

Rhéal Fortin Bloc Rivière-du-Nord, QC

As I see it, Bill C-11, a pivotal piece of privacy legislation, is incomplete or ill-considered. We'll see. Quite a few deficiencies could have been avoided had you been involved in the legislative process from the outset. Wouldn't you say?

May 10th, 2021 / 12:45 p.m.

Privacy Commissioner of Canada, Office of the Privacy Commissioner of Canada

Daniel Therrien

Indeed, that's another relationship that's very important. I think we have a good relationship with the Competition Bureau. Again, as I said earlier, in the virtual world as in the physical world, it's normal to have a number of regulatory agencies that address different activities from different perspectives.

It's good to have both the Competition Bureau and the Office of the Privacy Commissioner. The important thing is to ensure that the law allows a certain sharing of information between these agencies, so that we can benefit from our respective expertise and also, from an operational perspective, we can divide files according to who's best placed to handle them.

At the general level, we need to be able to co-operate with other regulators, including the Competition Bureau. There are provisions in Bill C-11 to facilitate that, and that's a good thing. We look forward to further co-operation with the Competition Bureau and others.

May 10th, 2021 / 12:45 p.m.

Privacy Commissioner of Canada, Office of the Privacy Commissioner of Canada

Daniel Therrien

The government certainly cited the desirability of Canada maintaining adequacy status in the EU as one impetus for Bill C-11. Indeed, maintaining adequacy is important. It allows data flows between Canada and the EU without additional mechanisms, such as special contracts.

Clearly, for Canada, maintaining adequacy is helpful in preserving a freer flow of data between Canada and the EU. Beyond the EU, as I've said, we live in an interconnected world, and obviously, we have a neighbour to the south with whom we have very significant fundamental commercial relations, so data also needs to flow there.

I think that's all good, but we need to.... Hopefully, in the context of the review of Bill C-11, we can look at ways to allow these data flows, but in a way that recognizes that when data leaves Canada, the risks are higher.

I'm not advocating for ways to prevent these data flows, but certainly, in the submission you will now be able to read, we make certain recommendations on how to enhance the protection of personal information when it does leave Canada, while still allowing that.

May 10th, 2021 / 12:45 p.m.

Patricia Lattanzio Liberal Saint-Léonard—Saint-Michel, QC

Thank you, Mr. Chair.

Thank you, Mr. Therrien, for your testimony this morning. It was quite informative.

What I'm drawing from it is that there's a constant need to strike a balance between individual human rights, public confidence and economic growth. It's going to be quite a difficult task, because technology is forever evolving at a very fast pace. In my opinion, a restudy is more than warranted, as we do not know when we will get Bill C-11.

On the question of cross-border data, that's of interest to me because ensuring that cross-border data flows adhere to international best practices and standards will be instrumental for Canadian competitiveness.

Is it correct to say—and I want to go back to that European notion you were talking about earlier—that the EU data protection regulation remains the international gold standard? How can Canada ensure equivalency with this regulation? That would be my first question.

Why is it in Canada's interests to retain the equivalency with the EU?

May 10th, 2021 / 12:45 p.m.

Privacy Commissioner of Canada, Office of the Privacy Commissioner of Canada

Daniel Therrien

It's pretty clear that that type of use is unacceptable, even under the current privacy legislation. Penalties would be the answer in this case.

Deterring that kind of behaviour would require significant penalties, and neither the current act nor Bill C-11 sets out such penalties.

May 10th, 2021 / 12:40 p.m.

Privacy Commissioner of Canada, Office of the Privacy Commissioner of Canada

Daniel Therrien

We've seen an example of this with Clearview AI. To socialize with friends and family, users innocently use social media with no idea that the information they provide, including their photos, may be collated by a company like Clearview AI, which uses the data for so-called police investigations or, as mentioned, to conduct private investigations of individuals.

You mentioned that the presence of surveillance cameras in some public places also poses a significant risk. I would add, again, that facial recognition can play an important role, particularly in providing security in relation to certain events. The use of facial recognition in public places is a sensitive matter, but I wouldn't say it should be banned altogether.

I strongly encourage you to ask the other witnesses where they think the problems lie. For my part, I would answer that it is in several places. I don't think you can regulate the whole situation. You have to look at it from a values perspective, and that again brings me back to the question of anchoring legislation in a human rights framework. This is more apparent in the case of the Department of Justice proposals than in the case of Bill C-11. Values are important. Respect for human rights is important. Second, there should be mechanisms to balance commercial interests and human rights, and these mechanisms should be better than those in Bill C-11. We will forward our recommendations to you in this regard.

I would add as a final point that right now our laws in Canada and in many countries—it's not the case everywhere—are said to be technology neutral. That means that the principles apply equally across the board, regardless of the type of technology, including biometrics and facial recognition. There are great advantages to this, and I am not suggesting that this aspect of our laws should be set aside. I think one of the things that you should be looking at is—and your question is very relevant to this—whether there is a need to circumscribe facial recognition activities. This would mean either prohibiting them or subjecting some of them to particularly strict regulation. In this regard, I refer you to a draft regulation on artificial intelligence, published in April by the European Commission. In it, certain prohibited practices are defined, including the use of live facial recognition in certain public places, except for exceptional cases, such as the investigation of major crimes or acts of terrorism.

This is a mixture of general principles about how to balance commercial or governmental interests and human rights on the one hand, and laws of general application on the other. In my view, we need to ask ourselves if there is a case to be made for some specific rules that would either prohibit or strictly regulate this technology; it presents particular risks, because biometric data is permanent.

May 10th, 2021 / 12:35 p.m.

Privacy Commissioner of Canada, Office of the Privacy Commissioner of Canada

Daniel Therrien

I think very significant amendments to Bill C-11 should be made to adequately protect privacy.