Digital Charter Implementation Act, 2020

An Act to enact the Consumer Privacy Protection Act and the Personal Information and Data Protection Tribunal Act and to make consequential and related amendments to other Acts

This bill was last introduced in the 43rd Parliament, 2nd Session, which ended in August 2021.

Sponsor

Navdeep Bains (Liberal)

Status

Second reading (House), as of April 19, 2021
(This bill did not become law.)

Summary

This is from the published bill. The Library of Parliament often publishes better independent summaries.

Part 1 enacts the Consumer Privacy Protection Act to protect the personal information of individuals while recognizing the need of organizations to collect, use or disclose personal information in the course of commercial activities. In consequence, it repeals Part 1 of the Personal Information Protection and Electronic Documents Act and changes the short title of that Act to the Electronic Documents Act. It also makes consequential and related amendments to other Acts.
Part 2 enacts the Personal Information and Data Protection Tribunal Act, which establishes an administrative tribunal to hear appeals of certain decisions made by the Privacy Commissioner under the Consumer Privacy Protection Act and to impose penalties for the contravention of certain provisions of that Act. It also makes a related amendment to the Administrative Tribunals Support Service of Canada Act.

Elsewhere

All sorts of information on this bill is available at LEGISinfo, an excellent resource from the Library of Parliament. You can also read the full text of the bill.

May 10th, 2021 / 12:30 p.m.

Privacy Commissioner of Canada, Office of the Privacy Commissioner of Canada

Daniel Therrien

Two main mechanisms are relevant to Clearview's situation under CPPA.

The first one is the purpose clause—proposed section 5 of the CPPA—which confirms PIPEDA's approach of balancing commercial interests with privacy considerations. That clause does not say that privacy is a human right. Compared with the current law, it adds a number of commercial factors. There would be a balancing exercise, with the likelihood of greater weight given to commercial factors than under the current PIPEDA. That's point one.

Point two is that, assuming it would be inconsistent with the CPPA for Clearview to do what they did, Bill C-11 provides both an administrative penalty scheme and a criminal penalty scheme. The administrative penalty scheme is limited to an extremely narrow slice of violations of the CPPA. Whether Clearview obtained valid consent—with the understanding requirement that I referred to before—and whether Clearview struck the right balance between commercial interests and human rights: none of that can be the subject of administrative penalties under the CPPA.

In order for penalties to apply, the office would first have to make a finding, which would take about two years. Second, we would make an order; at that stage, a penalty would still be excluded. The tribunal would sit in appeal of our order, assuming the company still did not comply with it. If the company did not comply with an order several years after it was made, it would then be subject to criminal penalties and the criminal courts would be involved.

The process that leads to penalties is very protracted; we think it's something like seven years after the fact. What should be happening instead is that we should be able to impose penalties directly, subject of course to court review for fairness considerations vis-à-vis companies. We think the delay would be roughly two years under that model, compared with the model in Bill C-11.

May 10th, 2021 / 12:30 p.m.

NDP

Charlie Angus NDP Timmins—James Bay, ON

Thank you very much.

Mr. Therrien, when we first learned of the Clearview AI case, it seemed to be the worst possible scenario. Here we had this company that scraped millions of photos of Canadians without their consent—our kids' birthday parties, our backyard barbecues, us at work—and then created a database that they were selling to all manner of organizations.

They claim it was for police, but we know that individual police officers had it without oversight. We know that a billionaire, John Catsimatidis, used it to target his daughter's boyfriend. You launched an investigation. Clearview AI's attitude was “Too bad, so sad. You're just Canadians and we don't even feel obligated to follow the law.”

We had a new law, Bill C-11, come in. My understanding, my gut feeling, was that Bill C-11 would fix these things so that we would have more powers and we'd be able to target these companies to make them respect the law. Are you telling us that under Bill C-11 the weight of support would actually go to rogue outliers like Clearview AI over the rights of citizens?

Are you saying that, on the monetary penalties we've been told about that would ensure compliance, a company like Clearview AI would be completely exempt from that? Is that what we're seeing under this new law?

May 10th, 2021 / 12:30 p.m.

Privacy Commissioner of Canada, Office of the Privacy Commissioner of Canada

Daniel Therrien

It depends on the country. In Europe and elsewhere—such as some Latin American countries, Japan and, if I am not mistaken, South Korea—the approach we suggest already exists: protective provisions enforced within a human rights framework.

Then there are considerable penalties so that consumers can have confidence that their data is being handled with respect for their privacy. As one of the committee members said earlier, many companies are acting in a compliant manner, but some really need incentives. So there need to be significant penalties, and there are penalties in their legislation.

I would remind you that failures like Clearview AI's would not be subject to administrative penalties under the provisions of Bill C-11, which is rather hard to understand.

May 10th, 2021 / 12:25 p.m.

Bloc

Rhéal Fortin Bloc Rivière-du-Nord, QC

Thank you, Mr. Chair.

Mr. Therrien, the more I listen to you, the more I realize that studying Bill C-11 is quite a chore. The privacy situation is really concerning. It's something that everyone is concerned about, here in Quebec at least, and I'm sure it's the same in the rest of Canada, if not the entire planet.

I'm a little concerned about what you're telling us with respect to Bill C-11, which might not cover all the angles, some of which would be quite important. I note, among other things, your caveat about facial recognition data being immutable. Once we have that data, it will be there for life. I also note the issue of exchanges between countries, where we must be even more careful, because the protections are not the same in all countries. In this day and age, with more and more trade between countries, I guess you have to be more and more careful, and put more time and effort into it. Those are some of the concerns we have.

When Bill C-11 was being developed, did you intervene? Was your office called in to advise the minister, and did you try to have the various safeguards that you feel are missing from the current version of the bill included? Have you prepared a report or other document?

May 10th, 2021 / 12:20 p.m.

Privacy Commissioner of Canada, Office of the Privacy Commissioner of Canada

Daniel Therrien

Bill C-11, as mentioned, does not take a human rights approach to the privacy law in question. It would be very beneficial if the proposed CPPA had a human rights foundation, because the principle of accuracy that I just alluded to could then be used to address potential discrimination against populations in the use of facial recognition. It would be part of our remit to ensure that, under privacy principles, technology resulting in discrimination would be found contrary to privacy.

I'll say that some would argue that these issues should be addressed through human rights legislation. Certainly, that's a credible point. I would say that, in the virtual world as in the physical world, the fact that there is some overlap in the jurisdiction of regulators here, as between my office and the Canadian Human Rights Commission, is not a bad thing as long as the regulators speak to one another, are efficient and benefit from each other's expertise. Our model would be to have a human rights approach to privacy law.

May 10th, 2021 / 12:20 p.m.

Liberal

Han Dong Liberal Don Valley North, ON

Without any empirical evidence that facial recognition technology could be harmful, particularly to racialized communities, is there anything we can do in Bill C-11 to provide a guardrail ensuring that vulnerable communities aren't harmed as the technology develops?

May 10th, 2021 / 12:10 p.m.

Conservative

Colin Carrie Conservative Oshawa, ON

Thank you very much, Mr. Chair.

Monsieur Therrien, I want to thank you for your wisdom. With Bill C-11 coming down the pipe, it's so important that we understand which way we should lean.

I know that with facial recognition, when you first see it, it's so cool. We all heard about the Cadillac Fairview shopping mall issue. Maybe we'll get to that today, but even sites like Facebook have these tag suggestions, inserted as default settings. These sites are collecting our data, our faces, and many times people are totally unaware of it. That's where I want to start our conversation today.

I come from Oshawa. Oshawa is one of those communities that historically built cars and sent people back and forth across the border, things along those lines. I want to talk to you a little bit about the international utilization of facial recognition. I've heard that border efficiencies could be improved. I was wondering if you could comment on the opportunity, perhaps, for these opt-in, opt-out options if we're going back and forth across borders for business or as individuals.

Are there any international conversations about the right to delete and destroy information that may be gathered from Canadians as they cross borders into other countries?

May 10th, 2021 / noon

Conservative

The Chair Conservative Chris Warkentin

I'm calling this meeting back to order.

For the second hour of this meeting, we're launching our study on facial recognition software and concerns related to it. Today we have the commissioner, who has agreed to remain here for an additional hour so that he can answer some questions as we launch into the investigation of this matter.

Thank you, Commissioner, for remaining with us.

We also have Mr. Homan, who is remaining with us as well, and Lara Ives, who is the executive director of the policy, research and parliamentary affairs directorate. Thank you so much for being here. Finally, we have Regan Morris, who is joining us as legal counsel.

Thank you as well for being here.

Commissioner, I'll turn it back to you for an opening statement to allow you to begin the discussion. Then we'll have questions for you.

Thank you again, Mr. Chair.

Facial recognition technology has become an extremely powerful tool that, as we saw in the case involving Clearview AI, can identify a person in a bank of billions of photos or even among thousands of protesters. If used responsibly and in appropriate situations, it can provide significant benefits to society.

In law enforcement, for example, it can enable police to solve crimes or find missing persons. However, it requires the collection of sensitive personal information that is unique to each individual and permanent in nature. Facial recognition technology can be extremely privacy invasive. In addition to promoting widespread surveillance, it can produce biased results and undermine other human rights.

The recent Clearview AI investigation, conducted jointly with my counterparts in three provinces, demonstrated how facial recognition technology can lead to mass surveillance and help treat billions of innocent people as potential suspects. Despite our findings that Clearview AI's activities violated Canadian privacy laws, the company refused to follow our recommendations, such as destroying the photos of Canadians.

In addition, our office is currently investigating the Royal Canadian Mounted Police's (RCMP) use of Clearview AI technology. This investigation is nearing completion. We are also working with our colleagues in all provinces and territories to develop a guidance document on the use of facial recognition by police forces. We expect to release a draft of this document for consultation in the coming weeks.

The Clearview case demonstrates how citizens are vulnerable to mass surveillance facilitated by the use of facial recognition technology. This is not the kind of society we want to live in. The freedom to live and develop free from surveillance is a fundamental human right. Individuals do not forgo their rights merely by participating in the world in ways that may reveal their face to others or enable their image to be captured on camera.

The right to privacy is a prior condition to the exercise of other rights in our society. Poorly regulated uses of facial recognition technology, therefore, not only pose serious risks to privacy rights but also impact the ability to exercise such other rights as freedom of expression and association, equality and democracy. We must ensure that our laws are up to par and that they impose limits to ensure respect for fundamental rights when this technology is used.

To effectively regulate facial recognition technologies, we need stronger protections in our privacy laws, including, among other things, a rights-based approach to privacy, meaningful accountability measures and stronger enforcement powers. The federal government recently introduced two proposals to modernize our privacy laws. These are important opportunities to better regulate the use of facial recognition and other new technologies.

Last November, the Department of Justice released a comprehensive and promising consultation paper that outlined numerous proposals to improve privacy legislation in the federal public sector. It proposes enhanced accountability requirements and measures aimed at providing meaningful oversight and quick and effective remedies. It also proposes a stronger collection threshold, which would require institutions to consider a number of factors to determine whether the collection of personal information is "reasonably required" to achieve a specific purpose—for example, ensuring that the expected benefits are balanced against the privacy intrusiveness—so that collection is fair, not arbitrary, and proportionate in scope.

In the private sector, Bill C-11 would introduce the consumer privacy protection act. In my view, as I stated in the last hearing, that bill requires significant amendments to reduce the risks of facial recognition technology. The significant risks posed by facial recognition technology make it abundantly clear that the rights and values of citizens must be protected by a strong, rights-based legislative framework. The Department of Justice proposes adding a purpose clause to the Privacy Act that specifies that one of the key objectives of the legislation is “protecting individuals' human dignity, personal autonomy, and self-determination”, recognizing the broad scope of the right to privacy as a human right.

Conversely, Bill C-11 maintains that privacy and commercial interests are competing interests that must be balanced. In fact, compared with the current law in the private sector, PIPEDA, the bill gives more weight to commercial interests by adding new commercial factors to be considered in the balance without adding any reference to the lessons of the past 20 years on technology's disruption of rights.

Clearview was able to rely on the language of the current federal act, PIPEDA, to argue that its purposes were appropriate and that the balance should favour the company's interests rather than privacy. Although we rejected these arguments in our decision, some legal commentators have suggested that our findings circumvented PIPEDA's purpose clause by not giving sufficient weight to commercial interests. Even though we found that Clearview breached PIPEDA, a number of commentators, including but not limited to the company, are saying that we actually misapplied the current purpose clause.

If Bill C-11 were passed in its current form, Clearview and these commentators could still make these arguments, buttressed by a purpose clause that gives more weight to commercial factors. I urge you to make clear in Bill C-11 that where there is a conflict between commercial objectives and privacy protection, Canadians' privacy rights should prevail. Our submission analyzing this bill makes specific recommendations on the text that would achieve this goal.

Demonstrable accountability measures are another fundamental mechanism to protect Canadians from the risks posed by facial recognition. Obligations to protect privacy by design, conduct privacy impact assessments, and ensure traceability with respect to automated decision-making are key elements of a meaningful accountability framework. While most of these accountability measures are part of the Department of Justice's proposals for modernizing public sector law, they are all absent from Bill C-11.

Efforts to regulate facial recognition technologies must also include robust compliance mechanisms that provide quick and effective remedies for individuals.

Our investigation into Clearview AI revealed that the organization had contravened two obligations under Canadian privacy law: it collected, used and disclosed biometric information without consent, and it did so for an inappropriate purpose.

Remarkably—and shockingly—the new administrative penalty regime created by Bill C-11 would not apply to these and other important violations of the legislation. Such a penalty regime renders meaningless laws that are supposed to protect citizens.

I therefore urge you to amend the bill to remedy this fundamental flaw.

In conclusion, I would say that the nature of the risks posed by facial recognition technology calls for collective reflection on the limits of acceptable use of this technology. These limits should not be defined only by the risks associated with specific facial recognition initiatives, but by taking into account the aggregate social effects of all such initiatives over time.

In the face of ever-increasing technological capabilities to intrude on our private lives, we need to ask ourselves what are the expectations we should be setting now for the future of privacy protection.

I thank you again for your attention.

I welcome any questions you may have.

May 10th, 2021 / 11:50 a.m.

NDP

Charlie Angus NDP Timmins—James Bay, ON

Thank you so much.

I want to begin by saying that I am in complete agreement with my colleague, Mr. Sorbara, on the importance of getting Bill C-11 right, because it is about the rights of 38 million Canadians, and we all have that obligation.

Our committee previously brought forward a number of recommendations about the order-making powers of the Privacy Commissioner, as well as the need to be able to levy substantial fines. We believe the vast majority of privacy infringements are accidental or without malice, but we do have some bad operators. We had Facebook say they didn't feel they had to follow Canadian law, and we certainly see the same attitude with Clearview AI, so the need to give you more tools was clear.

What concerns me, when I look at Bill C-11, is this idea of creating this regulatory tribunal that these companies could then go to about a decision.

I'd like to ask you, first, whether any other jurisdiction has an example of this kind of regulatory tribunal that can override a privacy commissioner's decision, and how you feel about it. You have stated that you believe this tribunal would encourage companies to choose the route of appeal rather than find common ground with the Privacy Commissioner's decisions, and that it would actually delay and obstruct justice for consumers and privacy rights.

Could you give your thoughts on this regulatory tribunal balloon that has been floated by the government?

May 10th, 2021 / 11:45 a.m.

Privacy Commissioner of Canada, Office of the Privacy Commissioner of Canada

Daniel Therrien

Obviously, it's a very broad question. I will try to do justice to it in a few seconds or minutes.

Consent is a fundamental aspect of the current law, PIPEDA, and it will continue to have a central role under the CPPA under Bill C-11, so there is a place for consent in privacy in 2021. There need to be some rules to make sure that when consent does work, it is obtained in a meaningful way. In my view, that means, in part, to ensure that the consumers who provide consent have a good idea of what they are consenting to, which is not obvious. That's where consent does work.

As I was saying in the documents you were referring to, given where we are with digital developments, there are many situations, a growing list of situations, where consent does not really work, particularly when you think of artificial intelligence, for instance, where the purpose of the technology is to use information for purposes other than that for which it was obtained. That's not really conducive to consent being an adequate means to protect privacy.

Given where we are in 2021, and the following years, there is a role for consent, but we also need to have laws that acknowledge that consent will not always work. Then we need to find an adequate means of protecting privacy absent consent. That's where the real difficulty, I think, lies in the discussion of these issues, particularly with Bill C-11.

Bill C-11 has many more exceptions to consent, some appropriate, others too broad in our view. How do you protect privacy if consent is not the preferred means of protecting it? We propose a human rights approach to privacy protection. Other models are proposed, such as the fiduciary model that Mr. Angus was referring to.

The extremely difficult challenge ahead of Parliament in the next few months is to determine where consent does not work—and it does not always work—and what would be a good model to continue to protect privacy adequately absent consent.

May 10th, 2021 / 11:35 a.m.

Privacy Commissioner of Canada, Office of the Privacy Commissioner of Canada

Daniel Therrien

Yes. There's no question that Bill C-11 is a comprehensive and serious attempt to address privacy issues in the digital world, but at the most general level, we think that in order to provide adequate protection for privacy, the bill needs very significant changes.

Why? In part it's because we think that even though there are provisions on consent in Bill C-11, the ultimate impact would be less control for individuals, in part for the reason you suggest: the requirement in the current law that individuals, as consumers, understand the consequences of what they are being asked to consent to does not exist in Bill C-11 as drafted.

There are also important exceptions to consent in Bill C-11, some of which are appropriate but others much too broad. For example, there is an exception to consent where it is “impracticable” to obtain consent. We think that such an extremely broad exception to consent makes the rule hollow—so less control for individuals, more flexibility in Bill C-11 for organizations. We're not against additional flexibility for organizations per se, particularly when organizations want to use personal information for the public good or for a legitimate public interest, but we think additional flexibility should come with additional accountability.

May 10th, 2021 / 11:30 a.m.

NDP

Charlie Angus NDP Timmins—James Bay, ON

Thank you very much for that. I really appreciate it. I understand that you are investigating, so I won't ask you any more on that. I just wanted to clarify that.

In the last Parliament, our committee sent the government a number of recommendations on strengthening the role of your office and ensuring that we get stronger protections for Canadians' privacy rights, the rights of our citizens. I have spoken with many people in the privacy and data field who have looked at this new legislation, Bill C-11, and they're raising concerns that this legislation may actually hinder a number of the objectives that we had laid out at our committee in the previous Parliament. One of those is the issue of meaningful consent.

You state that the consumer privacy protection act “leaves out an important facet of our current legislation, the idea that meaningful consent requires that the person giving it understands the consequences of what they are consenting to.” You further state that you believe this law “would result in less consumer control than under the current law”.

Can you explain your concerns?

May 10th, 2021 / 11:30 a.m.

Privacy Commissioner of Canada, Office of the Privacy Commissioner of Canada

Daniel Therrien

What determines it is our assessment of risk, based on what we observe in various settings. In this context, the current legislation, like the future legislation under Bill C-11, requires us to investigate when complaints are referred to us.

Except in very rare cases, when a complaint is filed by an individual, the legislation requires us to investigate. This is a real constraint. Again, there are advantages to this system, particularly in terms of access to justice: we're an ombudsman with a relatively expedited process, one that is simpler than going to court.

I understand all of that, but the fact remains that it creates a real constraint because we have to investigate every complaint that comes in. We believe that, like other privacy regulators, we should have more flexibility. The question is what recourse there would be if the office were unable to investigate a complaint. One of the things Bill C-11 talks about is a private right of action before the courts.

These are sensitive issues, but having to investigate every complaint we receive is a real constraint.

May 10th, 2021 / 11:25 a.m.

Bloc

Rhéal Fortin Bloc Rivière-du-Nord, QC

Leaving aside what may happen next with Bill C-11, would you say that, as things stand now, you have the budget to carry out your mandate?

May 10th, 2021 / 11:25 a.m.

Privacy Commissioner of Canada, Office of the Privacy Commissioner of Canada

Daniel Therrien

Actually, that money would be needed because of the additional mandate we would have under Bill C-11.