Digital Charter Implementation Act, 2020

An Act to enact the Consumer Privacy Protection Act and the Personal Information and Data Protection Tribunal Act and to make consequential and related amendments to other Acts

This bill was last introduced in the 43rd Parliament, 2nd Session, which ended in August 2021.

Sponsor

Navdeep Bains, Liberal

Status

Second reading (House), as of April 19, 2021
(This bill did not become law.)

Summary

This is from the published bill. The Library of Parliament often publishes better independent summaries.

Part 1 enacts the Consumer Privacy Protection Act to protect the personal information of individuals while recognizing the need of organizations to collect, use or disclose personal information in the course of commercial activities. In consequence, it repeals Part 1 of the Personal Information Protection and Electronic Documents Act and changes the short title of that Act to the Electronic Documents Act. It also makes consequential and related amendments to other Acts.
Part 2 enacts the Personal Information and Data Protection Tribunal Act, which establishes an administrative tribunal to hear appeals of certain decisions made by the Privacy Commissioner under the Consumer Privacy Protection Act and to impose penalties for the contravention of certain provisions of that Act. It also makes a related amendment to the Administrative Tribunals Support Service of Canada Act.

Elsewhere

All sorts of information on this bill is available at LEGISinfo, an excellent resource from the Library of Parliament. You can also read the full text of the bill.

Digital Charter Implementation Act, 2022 (Government Orders)

November 28th, 2022 / 12:20 p.m.

NDP

Brian Masse NDP Windsor West, ON

Madam Speaker, I applaud the minister for bringing the issue forward to Parliament. Again, I want to exercise some caution that the first two pieces of the legislation are much easier to deal with, because at least there was some discussion on those with Bill C-11. It is a bit different in this one, and the tribunal is an issue, but I am open to looking at it. I just have concerns about that. However, the artificial intelligence part of it is critical. I am glad it is in front of us, but it is going to require much more extensive debate and care, and that is why it should be entirely separate.

We in the NDP have proposed a fairly reasonable compromise, and the Speaker will rule on it. The proposed compromise is that there would be a separate vote for that particular part of the bill. The reason is that perhaps the first two parts could lead to a decision that might be different from the decision on the last part, just to ensure that we get enough testimony and time in committee for it.

I am looking forward to all perspectives in the House on this. It is time for us to look at that. It is a reasonable position, and I am glad it is in front of us. I do not like the way it is in front of us, but we will deal with that.

Digital Charter Implementation Act, 2022 (Government Orders)

November 28th, 2022 / noon

NDP

Brian Masse NDP Windsor West, ON

Madam Speaker, I am happy to start this week by speaking to Bill C-27. It is quite an extensive bill, at over 140 pages. It would amend several acts, and the most consequential parts are three in particular, as it is an act to enact the consumer privacy protection act, the personal information and data protection tribunal act and the artificial intelligence and data act and to make consequential and related amendments to other acts.

I should start by saying that this is really three pieces of legislation that have been bundled into one. As New Democrats, we have called for a separate vote on the third and final part of this act.

The first two parts of the act, concerning the consumer privacy protection act and the personal information and data protection tribunal act, do have enough common themes running through them to be put together into one piece of legislation. I still think they would have been better as two separate pieces of legislation, because one of them is brand new and the first one, the consumer privacy protection act, is the former Bill C-11, which was highly controversial in the previous Parliament.

When we had an unnecessary election called by the Prime Minister, that bill died, along with all of the unfinished work of Parliament, despite the extensive lobbying and consultation that went on, particularly through the ethics committee at that time. This has now been bundled with some other legislation to go through the industry committee, which is fine.

The personal information and data protection tribunal act is a new component of this legislation. I have some concerns about that element of it, but it does have a common theme, which is worthwhile, and at least it has the potential to be put together and bundled. Although, again, it is extensive, it is a bundling that we can accept.

We have called for a Speaker's ruling with regard to the artificial intelligence and data act, as this is brand new legislation as well, but it does not have the same connections as the previous two pieces, which are bundled together in a way that one could argue for. We want a separate vote on the second part of this because the legislation would be studied at committee together.

There will be a high degree of interest in this legislation, as there was with Bill C-11 in the past. The new bill departs significantly from Bill C-11, and I expect that this in itself will garner a lot of chatter, review and interest from a number of organizations, many of whom we have already heard from.

The other part, the tribunal, is another important aspect, because it is a divergence from our traditional way of enforcement and creates another bureaucratic arm. Again, I would like to see more on this, and I am open to considering the idea, but it is certainly different from our traditional private right of action for settling disputes about data breaches and other types of corporate malfeasance, which actually deals with the types of laws that are necessary to bring about compliance.

This goes to the heart of, really, where a political party resides in its expectations of companies and their use of data, information and algorithms. For New Democrats, we fall very much in line with something I have tabled before, several years ago, which is a digital bill of rights, so that one's personal rights online are consistent with one's physical rights, where one is expected to be properly treated in the physical world and in the digital world. That includes one's right to privacy, one's right to the expectation of proper behaviour conducted toward oneself and one's right not to be abused. It also includes significant penalties for those who commit those abuses, especially when we are looking at the corporate world.

Where this legislation really becomes highly complicated is the emergence of artificial intelligence, which has taken place over the last decade and will be significantly ramped up in the years to come. That is why the European Union and others have advanced on this, as well as the United States.

Our concern is that this bill tries to straddle both worlds. We all know that Google and the other web giants have conducted significant lobbying efforts over the last number of years. In fact, they have tripled their efforts since this administration came into place and have had a direct line of correspondence in their lobbying, which is fine to some degree, but the expectation among people that it would be balanced does not seem to be met.

I want to bring the impact on people into the discussion before I get into the technical aspects of the bill, as well as the data breaches that remind us of the need to protect our citizens and companies alike. One of the things that is often forgotten is that SMEs and others can be compromised quite significantly by this, so protecting people individually is just as important for our economy, especially with the emergence of new industries. If people's behaviours are hampered, manipulated or streamed, that can become a significant issue.

I want to remind people that some of the data breaches we have had with Yahoo, Marriott, the Desjardins group and Facebook, among others, have demonstrated significant differences between the regulatory systems of Canada and the United States and how they treat victims. A good example is the 2019 settlement in the U.S. over the Equifax data breach, in which Equifax agreed with U.S. authorities to pay $700 million to settle lawsuits over the breach, including $425 million in monetary relief for consumers. We have not had the same type of treatment here in Canada.

This is similar to the work I have done in the past with the auto industry and the fact that our Competition Bureau and our reimbursement systems are not up to date. We have been treated basically as a colony by many of the industries when it comes to consumer and retail accountability.

We can look at the example of Toyota and the data software issue, where the car pedal was blamed for the cars going out of control. It turned out this was not the case. It was actually a data issue. In the U.S., this resulted in hundreds of millions of dollars of investment into safety procedures. We received zero for that. Also, consumers received better treatment, where their vehicles were towed back to different dealerships to be fixed. In Canada, consumers did not receive any of that.

The same could be said with Volkswagen, another situation that took place with emissions. Not only did we not receive compensation similar to that of the United States, we actually imported a lot of the used Volkswagen vehicles from Europe. However, that was of our own accord and time frame when those vehicles were being sunsetted in those countries because of emissions.

In the case of Facebook, the U.S. Federal Trade Commission was able to impose a $5-billion fine for the company's violation, while the Privacy Commissioner's office was forced to take the company to federal court here in Canada. One of the things I would like to point out is that our Privacy Commissioner has stood up for the needs of Canadians, and one of the concerns with this bill would be the erosion of the Privacy Commissioner's capabilities in dealing with these bills and legislation.

The Privacy Commissioner has made some significant points on how to amend the bill and actually balance it, but they have not all been taken into account. One of the things we will be looking closely at is whether the amendments our Privacy Commissioner considers necessary are made to this bill.

One of the big distinctions between Canada and the United States, which is to our benefit and to Canada's credit, is the office of the Privacy Commissioner. Where we do not have some of the teeth necessary for dealing with these companies, we do have the independent Privacy Commissioner, who is able to investigate and follow through, at least with bringing things to a formal process in the legal system. It is very laborious and difficult, but at the same time, it is independent, which is one of the strengths of the system we have.

If the government proceeds, we will see the bill go to committee, which we are agreeing to do. However, we do want to see separate voting. Before I get into more of the bill, I will explain that we want to see separate voting because we really want to draw the distinction that this bundling is inappropriate. The artificial intelligence act is the first time we have even dealt with this topic in the House of Commons, and it should be done differently.

We will be looking for amendments for this, and big corporate data privacy breaches are becoming quite an issue. Some of these privacy breaches get highly complicated to deal with. There have been cases with cybersecurity and even extortion. The University of Calgary is one that was well noted, and there have been others.

We need some of these things brought together. The bill does include some important fixes that we have been calling for, such as stronger enforcement of privacy rights, tough new fines and transparency in corporate decisions made by algorithms.

I have pointed out a lot of the concerns that we have about the bill going forward because of its serious nature. However, we are glad this is happening, albeit with the caveat that we feel the bill should be separate legislation. The minister does deserve credit for bringing the bill forward for debate in the House of Commons.

Bill C-11 should have been passed in the last Parliament, but here we are again dealing with it. The new tribunal is the concern that we have. It could actually weaken existing consent rules, and we will study and look at the new tribunal.

The tribunal itself is going to be interesting because it would involve an appointment process. There is always a concern when we have a government appointment process, and there could be complications in setting up the tribunal, such as who gets to sit on it, what their backgrounds and professions are and whether there will be enough support.

One of the things that gives me trouble is that the CRTC, for example, takes so long to make a decision. It is so laborious to go through, and most recently it has not always acted in the best interests of Canadians when it comes to consumer protection and individual rights. It gives me concern that having another tribunal act as a referee instead of the court system could delay things.

Some testimony and analysis has already been provided suggesting that the tribunal might end up in lawsuits anyway, so we could potentially be back to square one after that. The time duration, the funding, the ability to investigate and all these different things are very good issues to look at to find out whether we will have the proper supports for a new measure being brought in.

Adequate government resources for this are key. At the end of the day, if the tribunal system is not supportive of protecting Canadians' privacy and rights, then it will weaken the entire legislation. That is a big concern, because it would sit outside Parliament. The way some of the amendments are written, much of this could come through regulatory means, with less parliamentary oversight.

Who is going to be on the tribunal? How will it be consistent? How will it be regulated? I would point to the minister providing the CRTC with a mandate letter, which is supposed to emphasize the public policy direction it should be going. In my assessment, the CRTC, over the last number of years, has not taken the consumer protection steps that New Democrats would like to see.

When it comes to modernizing this law, we do know that it will be important to address data ownership, which is really at the heart of some of the challenges we face. There is algorithmic abuse, and there are also areas related to compensation, enforcement, data ownership and control, and a number of other things that are necessary to ensure the protection of people.

We can look at an area where I have done a fair amount of work related to my riding, which is automobile production. There is the production of the car and the value there, but there is also the data the vehicle collects. The use of that data can actually influence not only one's individual behaviour, but also that of society. That is a significant economic resource for some of these companies.

It is one of the reasons I have tabled an update to my bill on the right to repair. The right to repair is a person's ability to have their vehicle fixed at an aftermarket auto shop of their choice. The OEMs, the original equipment manufacturers, have at times resisted this. There have been examples. Tesla, for example, is not even part of what is called the voluntary agreement, and we still do not have an update with regard to the use of data and how one actually goes about fixing the vehicle.

It also creates issues related to ownership of the vehicle, as well as insurance and liability. These could become highly complicated issues related to the use of data and the rules around it. If these types of things are not clear with regard to people's rights, the expectations on those who use the data and the protections for people, then it could create a real, significant issue, not only for individuals but for our economy.

Therefore, dealing with this issue in the bill is paramount. A lot of this has come about by looking at what the GDPR, the general data protection regulation, did in European law. Europe was one of the first jurisdictions to bring forward this type of measure, and it has provided an adequate level of protection, which is one of the things Europe stands by with regard to protection of privacy. Some over here in North America have pushed back against the GDPR, and even though this landmark legislation has created a path forward, there is still a need for transparency and for understanding what the monetary penalties for abuse are going to be, which is also very important in terms of what we expect in the legislation.

The erosion of consent rights is one of the things we are worried about in this bill. Under Bill C-27, individuals would have significantly diminished control over the collection, use and disclosure of their personal data, even less than under Bill C-11. The new consent provisions ask the public to place an enormous amount of trust in businesses to keep themselves accountable, as the bill's exceptions to consent allow organizations to conduct many types of activities without any knowledge of the individuals. The flexibility under Bill C-27 allows organizations to define the scope not only of legitimate interests but also of what is reasonable, necessary and socially beneficial, thus modelling their practices in a way that maximizes the value derived from personal information.

What we have there is the actors setting some of the rules. That is one of the things we need to clarify through the discussion that will take place at committee and through the testimony we will hear, because if we let those who use and manage the data decide what consent is and how it is used, then we will create a system that could really lead to abuse.

There is also the issue, or danger, of de-identification. With artificial intelligence, and with organizations able to scrub much of people's data when they want and how they want, this is one of the things we are concerned about. There is not enough acknowledgement of the risk involved in this, including for young people. We believe this bill is a bit lopsided toward the business sector at the moment, and we want to propose amendments that would lead to better protection of individual rights and ensure informed consent as to what people want to do with their data and how they want it exercised as a benefit to them and their families, versus people being accidentally or wilfully exposed in ways they have not consented to.

As I wrap up, I just want to say that we have a number of different issues with this bill. Again, we believe there should be a separate vote for the second part of this bill, being the third piece of it. It is very ambitious legislation. It is as large as the budget bill. That should say enough with regard to the type of content we have. I thank the members who have debated this bill already. It is going to be interesting to get all perspectives. I look forward to the work that comes at committee. It will be one that requires extensive consultation with Canadians.

Division for Vote on Bill C‑27 (Points of Order, Routine Proceedings)

November 22nd, 2022 / 10:15 a.m.

NDP

Peter Julian NDP New Westminster—Burnaby, BC

Mr. Speaker, I rise today on a point of order regarding government Bill C-27, an act to enact the consumer privacy protection act, the personal information and data protection tribunal act and the artificial intelligence and data act and to make consequential and related amendments to other acts.

Standing Order 69.1 states the following:

(1) In the case where a government bill seeks to repeal, amend or enact more than one act, and where there is not a common element connecting the various provisions or where unrelated matters are linked, the Speaker shall have the power to divide the questions, for the purposes of voting, on the motion for second reading and reference to a committee and the motion for third reading and passage of the bill. The Speaker shall have the power to combine clauses of the bill thematically and to put the aforementioned questions on each of these groups of clauses separately, provided that there will be a single debate at each stage.

You will find that, in the case of Bill C-27, the bill enacts three new laws and amends several other existing laws.

Bill C-27 enacts the consumer privacy protection act and the personal information and data protection tribunal act.

These two acts were at the core of the former Bill C-11 in the 43rd Parliament, a bill that was introduced in November 2020 and died on the Order Paper a year later, without ever having been voted on at second reading.

Here is the purpose of part 1 of Bill C-27, as described in the text of the bill:

The purpose of this Act is to establish — in an era in which data is constantly flowing across borders and geographical boundaries and significant economic activity relies on the analysis, circulation and exchange of personal information — rules to govern the protection of personal information in a manner that recognizes the right of privacy of individuals with respect to their personal information and the need of organizations to collect, use or disclose personal information for purposes that a reasonable person would consider appropriate in the circumstances.

Part 2 of the bill sets up the personal information and data protection tribunal, which would have jurisdiction with respect to appeals made under different sections of the consumer privacy protection act. The link between part 1 and part 2 of Bill C-27 is clear, and I am not putting it into question in this appeal at all.

Where we have an issue, however, is with the third part of the bill.

Bill C‑27 also enacts the artificial intelligence and data act, which was not part of Bill C‑11, the previous version of this bill.

The purpose of part 3 of Bill C‑27, which enacts the artificial intelligence and data act, is as follows:

The purposes of this Act are:

(a) to regulate international and interprovincial trade and commerce in artificial intelligence systems by establishing common requirements, applicable across Canada, for the design, development and use of those systems; and

(b) to prohibit certain conduct in relation to artificial intelligence systems that may result in serious harm to individuals or harm to their interests.

During his second reading speech on Bill C‑27, the Minister of Innovation, Science and Industry said that the new artificial intelligence act would “set a foundation for regulating the design, development, deployment and operations of AI systems”.

The development of artificial intelligence systems in the past decade has led to profound changes in the way we do things. Regulating AI systems is something we believe must be done. However, it seems odd to add these regulations to a bill that has to do with privacy protection and with the analysis, circulation and exchange of personal information. Artificial intelligence is its own beast in a way, and it should be studied and treated separately.

In a ruling on March 1, 2018, Speaker Regan said the following:

The principle or principles contained in a bill must not be confused with the field it concerns. To frame the concept of principle in that way would prevent the division of most bills, because they each apply to a specific field.

The House leader of the Bloc Québécois and member for La Prairie will remember this, since it is from page 400 of Parliamentary Procedure in Québec.

The Speaker continued as follows:

While their procedure for dividing bills is quite different from ours, the idea of distinguishing the principles of a bill from its field has stayed with me. While each bill is different and so too each case, I believe that Standing Order 69.1 can indeed be applied to a bill where all of the initiatives relate to a specific policy area, if those initiatives are sufficiently distinct to warrant a separate decision of the House.

We find ourselves in a similar situation here. While some of the measures in Bill C-27 relate to digital technology, part 1 and part 2 have nothing in common with part 3.

Therefore, it would certainly be appropriate to divide this bill for the vote. The Speaker has that authority, and that would make it possible for members to thoroughly study this legislative measure and better represent their constituents by voting separately on these bills, which are quite different from one another.

Digital Charter Implementation Act, 2022 (Government Orders)

November 4th, 2022 / 1 p.m.

Bloc

René Villemure Bloc Trois-Rivières, QC

Mr. Speaker, I would like to begin by giving a shout out to my constituents in Trois-Rivières, whom I will be visiting all next week in my riding.

When I talk to people on the street, privacy is a topic that comes up a lot. They know that I sit on the Standing Committee on Access to Information, Privacy and Ethics, and privacy comes up often. People tell me that it is important, that we must do our best to rise to the challenge. Today, we have the opportunity to debate that very subject.

Society is a human construct. It is a reflection of how we organize our lives together. It reflects our vision of the world, the role of a citizen, the role of the state. In a democratic society where elected officials are chosen by the people to represent them, our laws must reflect our desires and the desires of our fellow citizens, as well as the way in which their visions can be realized. In other words, a society and its laws are eminently cultural constructs.

When we compare the legislation passed in the House of Commons with that of the Quebec National Assembly, the difference is striking. Ottawa tends to emphasize the enforcement mechanism, whereas in Quebec, the emphasis is on the legislator's intent. Ottawa wants to arbitrate, while Quebec wants to prescribe and guide.

When it comes to privacy, this is especially true in the digital age: the difference is dramatic.

At one end of the spectrum, so to speak, is the United States. In the United States, laws are primarily intended to arbitrate disputes rather than to shape how the digital economy operates. Laws are based on the good faith of the players and on voluntary codes. As one might imagine, this has its limits. Ultimately, if someone is wronged, they can get redress through the common law.

At the other end of the spectrum is the European Union. The legislation there prescribes clear obligations. I am referring to the General Data Protection Regulation, better known by the acronym GDPR.

In between is Canada, a hybrid creature whose intentions on privacy oscillate between the European and American extremes. This may seem like an academic debate, but there are practical implications that bring us to Bill C-27.

When it comes to privacy, European law is the most prescriptive in the world. It is based on a clear principle, namely that our personal information belongs to us and us alone, and no one can use it or benefit from it without our free, informed and explicit consent.

Once the government set out that principle or objective, it then provided a mechanism for achieving it. That mechanism is the GDPR. The GDPR is becoming the standard to follow when it comes to privacy, because it is the legal standard with the clearest objectives and the most binding application. Simply put, the GDPR does a good job of protecting privacy. That is one reason why it is the standard we should be emulating; the other is that the EU is projecting its standard-making power beyond its borders.

In order to protect the personal information of European citizens, the European Union will soon prohibit European businesses from sharing this information with foreign businesses that do not offer comparable protection. This does not affect us yet, but next year, the EU will be reviewing Canada's laws to see if they offer sufficient protection.

The existing legislation on the protection of personal information and electronic documents dates back to 2000. That was 22 years ago. We were in the dinosaur era, the pre-digital era, an era we barely remember now. Also, it is far from clear whether Canada passes the comparable protection test required under the GDPR.

Information exchanges between Canadian businesses and their European partners could become more complicated. This is particularly true in areas that deal with more sensitive information, such as the financial sector. It is therefore absolutely necessary to redraft the Personal Information Protection and Electronic Documents Act, which is completely outdated. It has not kept pace with technological change and the data economy, where we are both the consumer and the product. It has not kept pace with the legal environment, where Canada is a dinosaur compared to Europe, as I was just saying.

Nevertheless, my colleagues will have figured out that the Bloc Québécois is in favour of the principle of Bill C‑27. That said, I would like to make a general comment about Bill C‑27. For some reason, the government has put into one bill two laws with completely different objectives. The bill would enact the consumer privacy protection act and also the artificial intelligence and data act. Although there is a logical link between these two acts, they could be stand-alone bills. Their objectives are different, their logic is different and they could be studied separately.

I have a suggestion for the government. It should split Bill C‑27 into two bills. We could create what I would call the traditional Bill C‑27, which would deal with personal information and the tribunal. Then, what I would call Bill C‑27 B would address artificial intelligence. As I was saying, there are logical reasons for that, but there are also practical reasons. Let me be frank and say that the artificial intelligence act being proposed is more of a draft than a law. The government has a clear idea about the mechanism for applying it, but, clearly, it has not yet wrapped its head around the objectives to be achieved and the requirements to be codified.

The mechanism is there, the bureaucratic framework is there, but the requirements to be complied with are not. Apart from a few generalities, the law relies essentially on self-regulation and the good faith of the industry. I have often faced these situations, and I can say that the industry's good faith is not the first thing I would count on.

Apart from a few generalities, this relies on good faith, but that is not a good way to protect rights. I am not convinced that this bill should be passed as written; I think it needs to be amended. Bill C‑27 probably deserves the same fate that Bill C‑11, its predecessor, encountered in the last Parliament. The government introduced it, debate got under way, criticism was fierce, and the government let it die on the Order Paper so it could keep working on it and come back with a better version. I think that is exactly what should happen to the artificial intelligence act.

The government has launched a healthy discussion, but this is not a finished product. If we decide that the government needs to keep working on it and come back with a new version, we will also be delaying the modernization of privacy and personal information legislation. Given the European legislation, which I talked about earlier, that is not what the government wants to do. That is why I would cordially advise the government to split Bill C‑27.

I am going to focus primarily on personal information protection because that is the part of Bill C‑27 that is ready to go and has the most practical applications. As I said before, Bill C‑27 is an improved version of Bill C‑11, which was introduced in the fall of 2020.

However, Bill C-27 still does not establish privacy as a fundamental right. Bill C-11 was strong on mechanics, but weak on protection. The principles were also weak and consent was unclear. It was tough on large corporations and much less so on small businesses. When it comes to privacy, however, it is the sensitivity of the data that should dictate the level of protection, not the size of the company.

A new start-up that develops an app that aggregates all of our banking data, for example, may have only two employees, but it still possesses and handles extraordinarily sensitive information that must be protected as much as possible. I cannot help but think of the ArriveCAN app, which was developed by just a few people but has a large impact on the data that is stored.

Finally, Bill C-11 did not provide for any harmonization with provincial legislation, such as Quebec's privacy legislation. The Bloc Québécois was quite insistent on that. A Quebec company subject to Quebec law would also have been subject to federal law as soon as the data left Quebec. It would have been subject to two laws that do not say the same thing and have two different rationales. This would mean duplication and uncertainty. It was quite a mess. Passing Bill C-11 would have diminished, in Quebec at least, the legal clarity that is needed to ensure that personal information is protected.

Here is what Daniel Therrien, the then privacy commissioner, told the Standing Committee on Access to Information, Privacy and Ethics, of which I am honoured to be a member. He said, and I quote, “I believe that C-11 represents a step back overall from our current law and needs significant changes if confidence in the digital economy is to be restored.”

He proposed a series of amendments that would make major changes to the bill. I want to commend the government here. It listened to the criticism. It is rare for this government to listen, but it did so in this case. It buried Bill C-11. We never debated it again in the House and it died on the Order Paper. It reappeared only after being improved.

Bill C-27 shows more respect for the various jurisdictions and avoids the legal mess I was talking about earlier.

Our personal information is private and it belongs to us. However, property and civil rights fall exclusively under provincial jurisdiction under subsection 92(13) of the Constitution Act, 1867.

What is more, privacy basically falls under provincial jurisdiction. That is particularly important in the case of Quebec, where our civil law tradition leads us to pass laws that are much more prescriptive.

Last spring, Quebec's National Assembly passed Bill 25, an in-depth reform of Quebec's privacy legislation. Our law, largely inspired by European laws, given that we share a legal tradition, is the most advanced in North America. As we speak, it is clear that Quebec has exceeded the European requirements and that our companies are protected from any hiccups in data circulation.

Our principles are clear: Our personal information belongs to us. It does not belong to the party who collected it or the party who stores it. The implication is clear. No one can dispose of, use, disclose or resell our personal information without our free, informed and express consent. Bill C-11 challenged this legal clarity but Bill C-27, at the very least, corrects that.

Under clause 122(2) of Bill C‑27, the government may, by order, “if satisfied that legislation of a province that is substantially similar to this Act applies to an organization, a class of organizations, an activity or a class of activities, exempt the organization, activity or class from the application of this Act in respect of the collection, use or disclosure of personal information that occurs within that province;”.

In other words, if Quebec's legislation is superior, then Quebec's legislation will apply in Quebec.

When I met with the minister's office earlier this week, I asked for some clarification just in case. Will a Quebec business be fully exempt from Bill C‑27, even if the information leaves Quebec? The answer is yes. Will it be exempt for all of its activities? The answer is yes.

There is still some grey area, though. I am thinking about businesses outside Quebec that collect personal information in Quebec. In Europe, it is clear. It is the citizen's place of residence that determines the applicable legislation. The same is true under Quebec's legislation.

It is not as clear in Bill C‑27. Since the bill relies on the general regulation powers for trade and commerce as granted by the Constitution, it focuses more on overseeing the industry than on protecting citizens. That is the sort of thing we will have to examine and fix in committee. I look forward to Bill C‑27 being studied in committee so we can debate the substance of the bill.

I have to say that I sense the openness and good faith of the government. In that regard, I would like to tell the member for Kingston and the Islands to take note that, for once, I feel he is working in good faith.

Bill C‑27 will have a much greater impact outside Quebec than within it, because it is better drafted than Bill C-11. That is not the only aspect that was improved. The fundamental principles of the bill are clearer. Consent is more clearly stated. The more sensitive data must be handled in a more rigorous manner, no matter the size of the entity holding them. That is also more clear.

If the principles are clear, the act will better stand the test of time and adjust to the evolving technologies without becoming meaningless.

We will support it at second reading after a serious debate, but without unnecessary delays. However, we believe and insist that the real work must be done in committee. Bill C-27 is complex. Good principles do not necessarily make good laws. Before we can judge whether Bill C-27 is indeed a good law, we will need to hear from witnesses from all walks of life.

When it comes to privacy, it only takes one tiny flaw to bring down the whole structure. This requires attention to detail and surgical precision. The stakes are high and involve the most intimate part of our lives: our privacy.

For a long time, all we had to do to maintain our privacy was buy curtains. That is how it used to be. It kept us safe from swindlers. Then organizations started collecting data for their records. Bankers collected financial information, the government collected tax information and doctors collected medical records. This sensitive information had to be protected, but it was fairly simple, since it was written on paper.

Today, we live in a different world. Whereas personal information used to be a prerequisite for another activity, such as caring for a patient or getting a loan from a bank, it has become the core business of many companies, and large companies at that.

Computerization enables the storage and processing of astronomical volumes of data, also known as big data. Networking that data on the Internet increases the amount of available data exponentially and circulates it around the globe constantly, sometimes in perpetuity, unfortunately.

For many corporations, including web giants, personal data is crucial to the business model. Citizen-consumers are now the product they are marketing. To quote Daniel Therrien once again, we are now in the era of surveillance capitalism. Speaking of which, The Great Hack on Netflix is worth seeing. This is troubling.

Furthermore, for our youngest citizens, the virtual world and the real world have merged. Their lives are an open book on Instagram, Facebook and TikTok. They think they are communicating with the people who matter to them, but they are in fact feeding the databases that transform them into a marketable, marketed product. We absolutely have to protect them. We need to give them back control over their personal information, which is why it is so important to amend and modernize our laws.

I would like to close my speech with an appeal to the government. Bill C‑27 does a lot, but there are also many things it does not do, or does not do properly. Consent is all well and good, but what happens when our data is compromised, when it has been stolen, when it is in the hands of criminals? These people operate outside the law and therefore are not governed by the law. All the consent-related protocols we can think of go out the window. To avoid fraud and identity theft, we will have to clarify the measures to be taken to ensure that anyone requesting a transaction is who they say they are. This really is a new dynamic. In that respect, we are somewhat in the dark, even though, curiously, this is a growing problem.

There is another gap to fill. Bill C‑27 provides a framework for the handling of personal information in the private sector, but not in the public sector. The government is still governed by the same old legislation, which dates back to the pre-digital era. The legislation is outdated, as we saw with the fraud related to the Canada emergency response benefit. The controls are also outdated. I therefore call on the government to get to work and to do so quickly. We will collaborate.

Finally, there is another thing the government needs to work on, and fast. We addressed this issue in committee when we were looking at the geolocation of data. Bill C‑27 indicates what we need to do with personal data, nominative data. However, with artificial intelligence and the cross-tabulation of data, it is possible to recreate an individual based on anonymous information. Since no personal information was collected at the outset, Bill C‑27 is ineffective in these cases. Yet what ends up being recreated is the profile of a person, with all their personal information. It is not science fiction. It is already happening. Nevertheless, this is missing from Bill C‑27, both in the part on information and in the part on artificial intelligence.

I am not bringing this up as a way of opposing Bill C‑27. As I said, we will support it. However, we have to be aware of the fact that it is incomplete. As legislators, we still have some work to do. The time has come to treat privacy as a fundamental right.

Digital Charter Implementation Act, 2022 (Government Orders)

November 4th, 2022 / 12:50 p.m.

Conservative

Ryan Williams Conservative Bay of Quinte, ON

Mr. Speaker, once again, I am disappointed. I guess the Liberals and NDP do not really care about privacy rights for children, which we are talking about today. This is fundamental to the bill.

The minister did a lot of hard work putting this bill together, and there have been a lot of consultations. This is the second iteration; Bill C-11 died only because an election was called. Now we have Bill C-27, which is very serious. It talks about the rights of our children and of Canadians, rights that have been trampled on. I gave a lot of different examples of where we just have not gotten it right in protecting children.

I am surprised that the NDP also does not seem to think that privacy is a fundamental right and something that we should protect. The Conservatives will certainly protect it. We are the only ones speaking about it today.

Digital Charter Implementation Act, 2022 (Government Orders)

November 4th, 2022 / 12:30 p.m.

Conservative

Ryan Williams Conservative Bay of Quinte, ON

Mr. Speaker, 34 years ago, the Supreme Court said that “privacy is at the heart of liberty in a modern state”. In the words of Justice Gérard La Forest of the Supreme Court of Canada in 1988, privacy is essential to the well-being of the individual and “it is worthy of constitutional protection”. All Canadians are worthy of having their privacy respected.

It is our duty as parliamentarians to do our best to protect Canadians' privacy rights, especially as we struggle so much for it today.

Bill C-27, formerly Bill C-11, is designed to update Canada’s federal private sector privacy law, the Personal Information Protection and Electronic Documents Act, or PIPEDA, to create a new tribunal and to propose new rules for artificial intelligence systems. It is a reworking of Bill C-11, and it has three components: the consumer privacy protection act; the personal information and data protection tribunal act, creating a new tribunal; and the artificial intelligence and data act.

The bill applies to Canadians' privacy rights in the private sector. It does not apply to CSIS, the RCMP or CSE; that and other government-held data are governed by the Privacy Act. Privacy laws for Canadians have not been updated in 22 years, while Europe adopted the General Data Protection Regulation in 2016.

When we last updated this act, 22 years ago, the member for South Shore—St. Margarets was turning 21 years old, and society was going through big changes. The world had just gotten past the Y2K scare. We were looking at what was going to happen to computers when the clock changed from 1999 to 2000. In certain areas, we did not know if the power would go out or what would happen.

People listened to music on CD Walkmans. Apple was over a year away from launching a cutting-edge new technology called the iPod. Less than 30% of Canadians actually owned a cellphone. The most popular cellphones were the Motorola Razr, which was a flip phone, and the Nokia brick phone, with texting that used the number pad and almost no web browsing capabilities. The most sophisticated app was called Snake. A fledgling Canadian telecommunications company was just starting, and it was called BlackBerry.

That is how long it has been since we updated our laws. Today, 22 years later, data collection is getting more sophisticated, and surveillance is more of the norm than the exception.

Apple announced a few weeks ago that the Apple Watch can track and tell when a woman is ovulating. What is concerning, and we are going to talk a lot about data for good and data for wrong, is that this technology can tell if a woman skips a cycle and can then identify whether there has been a miscarriage or an abortion. This is very concerning.

Our Fitbits, our web history and our Apple phones can tell us how many steps we took in a day. Sometimes when we are in Parliament it is about 10, and if we are door knocking it is about 25,000. That does not sound important, but that information is also letting those who collect it know where we have been, where we are going and where we live.

Facial recognition technology can identify a face like a fingerprint. Sometimes that is good. We have heard from law enforcement that it can be used in human trafficking cases. Sometimes that is wrong, when people are identified in the street, with their names, their data and where they have been. Let us think of Minority Report, where everywhere someone goes, they are identified. It did not matter where they were going or where they had been. That is something that could happen with facial recognition technology.

Google and Amazon listen and collect our data in our bathrooms, living rooms, kitchens and cars. How many times have we been in a conversation and Siri asks, “What was that?” Siri is always listening. Amazon is always listening. Speaking of cars, they are cellphones on wheels. When we connect to a rental car, and a lot of us rent cars, we see five or six other phones in the history. The car has downloaded all the data from those phones, and when we see that in a rental car, it means the car is holding our information too. It is very concerning.

There are many examples where it has hurt Canadians in the last several years. Two summers ago, Tim Hortons had a data breach, where every time someone rolled up the rim, it told Tim Hortons where they went afterwards, if they went home or where they were staying. It collected all that data, and it was a big problem.

In the ethics committee, we studied facial recognition technology. There was a company called Clearview AI, which took two billion images off the Internet, including a lot of ours, and just gave them to the police. There was no consent. The information just went and ended up in the hands of law enforcement.

There is Telus's “data for good”. During the pandemic, Telus collected our data. It knew where we went and if we went to the grocery store or the pharmacy, or if we stayed home. It just gave that to the government. It was called “data for good”. They called it de-identification. I am going to talk about how that hurt everyone later.

Lastly, there is doxing, or using personal information to try to out people. GiveSendGo is a big example. It gave a U.S. company the information of people who donated to different causes or events. At one point, Google identified all those donors on a website showing exactly where they lived. Everyone who donated had their information identified and outed. That was terrible.

Surveillance has resulted not just in a wholesale destruction of privacy but in a mental health crisis among children and youth as well. I am glad to hear the minister speak about children and youth, because data has certainly affected them and continues to.

Canada’s federal government has repeatedly failed to take privacy seriously and to construct a legal framework that protects the rights of Canadians in the digital age. This bill normalizes surveillance and treats privacy not as a fundamental human right, nor even as a right to consumer protection. To make this point very clear, nowhere in Bill C-27 does it state that privacy is a fundamental human right. Yet this should be the crux of new legislation to update privacy laws, if not its outward premise, with the statement hammered home from the preface to the end of Bill C-27 and followed through the entire document. However, it is not there. It is nowhere and, therefore, holds no value.

This bill does not make that statement from the outset, yet it should be the pillar by which the bill is designed and led. Only a strong bill will ensure that Canadians' privacy rights are protected. Because of this omission, the bill is very weak, making it easier for industry players to be irresponsible with people's personal data. This is ironic, as Canada has signed on to the UN's Universal Declaration of Human Rights and the International Covenant on Civil and Political Rights. That is where the bill starts and ends, with its failure to properly address privacy for Canadians.

Conservatives believe that Canadians’ digital privacy and data need to be properly protected. This protection must be a balance that ensures Canadians’ digital data is safe and that their information is properly protected and used only with their consent, while not being too onerous to be detrimental to private sector business. It is a balance.

Let us be clear. We need new privacy laws. In fact, it is essential to Canadians in this new digital era and to a growing digital future, but Bill C-27 needs massive rewrites and amendments to properly protect privacy, which should be a fundamental right of Canadians. The bill needs to be a balance between the fundamental right to privacy and privacy protection and the ability of business to responsibly collect and use data.

It also needs more nuance, as parts of this bill are far too vague. The definition of tyranny is the deliberate removal of nuance, so to create more equality or fairness around those privacy rights and to ensure businesses and AI use data for good, we need more nuance, with more detail and more explanation, not less. There is a saying I used to love that my grandfather would say: “If you're going to do something, make sure you do it right or don't do it at all.”

Besides the omission of privacy rights as a fundamental right, the bill needs a massive rewrite. First, the bill doubles down on a flawed approach to privacy using a notice and consent model as its legal framework. The legal framework of Bill C-27 remains designed around a requirement that consent be obtained for the collection, use and disclosure of personal information, unless one of the listed exceptions to consent applies. Those exceptions are called “legitimate interest”.

What is scary about legitimate interest is that the businesses themselves will determine what legitimate interest means and what will be exempt. Canada’s leading privacy and data governance expert, Teresa Scassa, says that this provision alone in the bill “trivializes the human and social value of privacy.” The legitimate interest provision allows Facebook, for instance, to build shadow profiles of individuals from information gathered from their contacts, even those with no Facebook access or accounts, without asking for their permission.

Have colleagues ever seen the “people you may know” feature on Facebook? Sometimes people turn up there although one might not know where they had ever met, and even though one of the parties may not actually be on Facebook. That is because Facebook builds profiles and shadow profiles from other members' contacts. Facebook has a feature that suggests one share their contacts, saying it will be great. People give all their friends' information to Facebook: their emails, addresses and sometimes their private phone numbers. In the U.S., that information was found turning up in Facebook. Here are a couple of examples. An attorney had a man recommended as a friend he might know who was defence counsel on one of his cases, when they had only communicated through a work email. Another time, a man who had secretly donated sperm to a couple had Facebook recommend their child as a person he should know, despite not being connected on Facebook to the couple, whom he once knew.

Legitimate interest needs more nuance. It needs to be better defined, or it is useless. It allows for too much interpretation. In other words, it allows something to be something unless it is not. It is far too broad.

Additionally, consent is listed as having to be “in plain language that an individual to whom the organization’s activities are directed would reasonably be expected to understand.” Bill C-27 makes it hard to determine what legitimate interests are, and that goes back to whether privacy is treated as a fundamental right.

If we compare this section to the European Union's privacy law, the GDPR, which is, as the minister stated, the gold standard, Bill C-27 makes the legitimate interest exemption available unless there is an adverse effect on the individual that is not outweighed by the organization's legitimate interest, as opposed to the interests or fundamental freedoms of the individual under the GDPR. If adverse effects on the individual can be data breaches, which are shocking and distressing to those impacted, and some courts have found that the ordinary stress and inconvenience of a data breach is not a compensable harm since it has become a routine part of life, probably for the last two years at least, then the legitimate interest exemption will be far too broad.

However, Bill C-27 would take something that was meant to be a quite exceptional exception to consent in the European Union's privacy laws and make it a potentially more mainstream basis for the use of data without consent. Why would it do this? It is because Bill C-27 places privacy on par with commercial interests in using personal data, something that would not happen if privacy were noted in the bill as a fundamental right of Canadians.

Additionally, we need to be wary of consent. If consent is to be mandatory, it should be made easier. Has anyone ever looked at their iPhone when agreeing to consent and scrolled down? Has anyone actually read all of that? Has anyone read Google's 38 pages of consent every time they sign up for or use Google?

Consent is not easy. It is not simple, and certainly this proposed law would not make it any simpler. We need to be wary of consent, and we need to ensure that consent is consensual, both in language and intent, and that we all know exactly what we are signing up to do, to give and to receive.

There is another term I want to explain as well, called “de-identification”. The bill talks a lot about de-identification, and its definition is that it “means to modify personal information so that an individual cannot be directly identified from it,” and then it goes on to say that “a risk of the individual being identified remains.” In other words, the direct identifiers would be stripped from an individual's information, but a risk of identifying the individual would remain.

Members will remember my Telus Data for Good example. Telus gave that information to the government during COVID, even though a risk of the individual being identified remained. That definition should be scrapped, and instead we should be using the word “anonymize”, which is also in the bill. This is what the GDPR does. In the bill, anonymize “means to irreversibly and permanently modify personal information, in accordance with generally accepted best practices, to ensure that no individual can be identified from the information, whether directly or indirectly, by any means.”

I would ask members which one they would prefer. Would they like information from which they could possibly be re-identified, or information from which they could not be identified by any means?
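To make the distinction concrete, here is a minimal illustrative sketch in Python. It is not drawn from the bill or from any real system; the records and helper names are entirely hypothetical. It simply shows why de-identified data, where a pseudonym stands in for a name, can still carry a risk of re-identification, while an anonymized aggregate cannot identify anyone by any means.

# Illustrative sketch only: hypothetical data and helper names, not taken from
# Bill C-27 or from any real organization's practice.
import hashlib

records = [
    {"name": "Alice Tremblay", "postal_code": "K1A 0A6", "visits": 4},
    {"name": "Bob Singh", "postal_code": "N9A 6J3", "visits": 7},
]

def de_identify(record, salt="org-internal-salt"):
    # Replace the direct identifier with a deterministic pseudonym. Anyone who
    # holds the salt and a list of candidate names can rebuild the mapping,
    # so a risk of the individual being identified remains.
    pseudonym = hashlib.sha256((salt + record["name"]).encode()).hexdigest()[:12]
    return {"id": pseudonym, "postal_code": record["postal_code"], "visits": record["visits"]}

def anonymize(all_records):
    # Keep only an aggregate count and total, irreversibly discarding anything
    # that could single out an individual, directly or indirectly.
    return {"record_count": len(all_records),
            "total_visits": sum(r["visits"] for r in all_records)}

print([de_identify(r) for r in records])   # pseudonymous rows, still linkable back
print(anonymize(records))                  # aggregate only, no individual recoverable

The point of the sketch is simply that the first function keeps one row per person and a repeatable link back to that person, while the second keeps only totals; only the latter approaches the bill's own description of anonymized information.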

Another major flaw in Bill C-27 is the creation of a bureaucratic tribunal instead of giving the Privacy Commissioner more bite. The creation of a tribunal is a time-waster; the Privacy Commissioner should be allowed to levy fines and should be given more power and more bite. The rationale for the tribunal is unclear, because the EU, the U.K., New Zealand and Australia do not have tribunals that mediate their fines for privacy violations. Furthermore, it would no doubt force those who have had their privacy violated to wait years to exercise a right of action.

I will put this plainly. First, we would have the Office of the Privacy Commissioner, or OPC, make a ruling. Then the government says it would have a tribunal, which could reverse the ruling of the Privacy Commissioner, and then we would have the Supreme Court, which would be allowed to rule on the tribunal's ruling. We would have a decision, another decision and a third decision, and each one of them could be countered.

Let me guess how long it would take. What do members think it would take? Would it take 48 hours or six months? Right now, the average is one year for the Privacy Commissioner, and we could add another year for the tribunal plus another year for appeals.

I ask this: Is it fair to have the average Canadian who has had their data breached, with their limited resources, have to go up against Facebook and Amazon and then spend three years in court? Does this protect fundamental privacy rights? Is this not just adding another layer of government that we certainly do not need?

The absence of rights-based language in the bill might tip the scale away from people in Canada when the OPC and the tribunal weigh the privacy interests of individuals against the commercial interests of companies. Again, what does this come back to? Privacy is not listed as a fundamental right of Canadians.

Lastly, the AI portion of this bill needs a complete rewrite. It needs to be split into its own bill.

I want to commend the minister for bringing this forward. He wants to be the first in the land to bring this part of the bill forward, but to be honest, consultations only started in June. We have met with many individuals who certainly have not had any input into this bill, and although AI is covered, there are many parts missing.

First of all, the AI part of the bill provides no independent and expert regulator for automated decision systems, nor even a shell of a framework for responsive artificial intelligence regulation and oversight. Instead, it says that the regulations will be determined at some future date and that decisions will come from the Minister of Innovation, Science and Economic Development or a designated official.

Again, part of this includes a new tribunal and puts decisions where they should not be, onto the government, with enforcement and decision-making by the minister or the minister's designated ISED official. These would be political decisions on privacy. Does everyone feel comfortable that we are now shifting from a tribunal to the government?

This part of the bill will shift all of that to the government, to the minister or his designate. It reminds me of the proclamation, “I'm from the government, and I'm here to help.”

There is also no mention of facial recognition technology in this part of the bill, despite the reports that have come from the ethics committee and the examples on FRT I gave earlier. Certainly, that is worth more study.

There are some parts of the bill that have good aspects and certainly ones we can get behind, including the protection of children's privacy. As a father, I know it is so very important. Our children now have access to all kinds of different applications on their phones, iPads and Amazon Fires.

Our children are being listened to and they are being surveilled. There is no question that businesses are taking advantage of those children and that is something that we definitely need to talk about.

The attempt to regulate AI, though, as I have stated, needs major revisions. Without a proper statement that privacy is a fundamental right, the bill does not have a balanced purpose statement establishing that the purpose of the CPPA is to set out rules governing the protection of personal information in a manner that balances the right to privacy with the need of organizations to collect, use or disclose personal information.

We should be aiming beyond the European Union's privacy law, aiming to be the world leader in balancing privacy protection with ensuring that businesses and industries use data for good. In doing so, we would attract investment and technology, all the while protecting Canadians' fundamental right to privacy.

Canada needs privacy protection that builds trust in the digital economy, where Canadians can use new technologies for good while being protected from the bad: profiling, surveillance and discrimination. The minister said that he wants to seize the moment and that we need leadership in a constantly changing world. Most importantly, the minister said that trust has never been more important.

If we do not get this right, if we do not make privacy a fundamental human right, declare that in the document and build the document around that right, we are doing two things: we are not prioritizing Canadians' privacy, because we are certainly not putting privacy at the forefront of the bill, and we are certainly not showing leadership in an ever-changing world.

As I noted at the outset, the technologies of 22 years ago have changed so significantly, and the technologies of today are changing even more significantly. In the next 22 years, we are going to have technologies that are more embedded in our lives, not less. We will have AI that does good.

One of the stakeholders we met with actually talked about AI for good. They talked about embedding AI into the government's passport system. That might actually mean we could get passports within 48 hours. Could we imagine that? Could we imagine embedding technology for good into a system that would allow Canadians to get the things they need more quickly?

We love technology. We want to embrace it. We just want to make sure that, number one, privacy is protected. We want to make sure we do the hard work of building frameworks in which Canadians' fundamental human right to privacy is protected in equal balance with the economy, democracy and the rule of law. This bill does not do that, not yet.

Let us work to make sure we come back with a bill that does that.

Digital Charter Implementation Act, 2022 (Government Orders)

November 4th, 2022 / 12:25 p.m.

Liberal

François-Philippe Champagne Liberal Saint-Maurice—Champlain, QC

Mr. Speaker, it is a pleasure to see you in the big chair.

The answer to my hon. colleague's question is absolutely.

There are parents listening to us at home today. The greatest gift we could give children is to refer Bill C-27 to a committee so that the questions my colleague raised can be properly studied. What she said in her introduction is correct. There are three simple things behind Bill C‑27. First, we want to give individuals more control and power over their online information. Next, as a parent, I feel it is fundamental that there be better protection for our children in the digital age. Finally, it will regulate artificial intelligence so that it is used responsibly and serves the public.

I believe it is time to bring our 20-year-old legislation into the 21st century. That is a good thing, and it is what Canadians want. It may reassure my colleague to know that during the study of Bill C‑11, we listened to many experts and collected comments to ensure not only that we have a good law, but that we are among the best in the world and that we set an example on the international stage.

I am pleased to hear that, like me, my colleague thinks that the best gift we can give our young people before Christmas is to send Bill C‑27 to committee to get it passed as quickly as possible.

November 1st, 2022 / 12:20 p.m.

Liberal

Ryan Turnbull Liberal Whitby, ON

One recent concern that stands out in my mind is Canada Proud tweeting @ElonMusk, hours after he became the owner of Twitter, to ask about Bill C-11, which we know was the subject of significant disinformation in the last election.

What role do social media companies have in being responsible actors during and leading up to elections?

June 13th, 2022 / 11:50 a.m.

Nominee for the position of Privacy Commissioner, As an Individual

Philippe Dufresne

I think long-term challenges will focus on the digital innovations we're seeing, on making sure there is the legal framework and on making sure the OPC has the internal expertise to provide good advice on that, in terms of codes of conduct.

There have been some discussions on de-identification and the prevention of reidentification. What is appropriate? How do you accept it, and what kinds of mechanisms do you need to put in place so de-identification is accepted as such? Are you minimizing the risk of reidentification? This is fundamentally important to ensure there is that framework.

On artificial intelligence, more and more of these decisions are being made by algorithms using information, so how do you address that? There were some elements in the GDPR and Bill C-11 related to algorithmic transparency, to understanding how those decisions are made and, ideally, being able to challenge those decisions. From a human rights standpoint, there were concerns raised about profiling, so how do you deal with technology that is advancing at an accelerating pace?

I think this is one of the challenges. Technology is accelerating very quickly, and legal amendments are not moving as quickly. We need to find ways to bridge that gap.

June 13th, 2022 / 11:45 a.m.

Nominee for the position of Privacy Commissioner, As an Individual

Philippe Dufresne

Thank you.

Of course, I very much look forward to meeting the team and speaking with all the colleagues.

I've looked at the DPR. I think one thing that is top of mind is the fact that, with the extension order for the Privacy Act in July and the expansion of the mandate, there will be an influx of new cases. That is something I know the commissioner has asked for additional funding for. That's going to be something to follow up on.

It has also been stated that, if the new PIPEDA is modelled on Bill C-11—hopefully with improvements based on a lot of the comments made—it, too, would require a doubling of resources, as I think Commissioner Therrien mentioned.

These would be some of the immediate discussions I would have with the team.

Specifically, in terms of order-making power and what kind of structure is needed, Commissioner Therrien talked about adjudicators and so on. Those are some of the elements, as well as making sure the office is prepared to advise Parliament when this bill comes in.

June 13th, 2022 / 11:35 a.m.

Conservative

Ryan Williams Conservative Bay of Quinte, ON

That sounds great.

I think the biggest criticism of Bill C-11 in the past has been its ability to stifle innovation as much as protect it. You said earlier to Mr. Bezan that privacy is not opposed to innovation and that we can have both. How do you think we can have both? What do you think is an appropriate balance of privacy and innovation?

June 13th, 2022 / 11:35 a.m.

Nominee for the position of Privacy Commissioner, As an Individual

Philippe Dufresne

There were a number of concerns raised, and there have been lots of comments made by the OPC, including recently to this committee, not necessarily looking back at Bill C-11 but anticipating, for the new iteration, what some of those elements should be. The first is a rights-based framework: making sure this is a regime that is not exclusively based on consent and that recognizes privacy as a fundamental right. Dealing with de-identified information is very important, as is ensuring there are prohibitions on reidentification and calibrating the rules so that such information does not fall outside of the law. So is dealing with automated decisions and artificial intelligence, all of these new things that weren't present before.

There were some discussions on Bill C-11 in terms of whether you needed a tribunal to review the commissioner's decisions on penalties. The OPC took the position that there should not be one and that the decision should be a final decision of the OPC, subject to judicial review. This is going to be important to look at. I share the concerns about delays if you add layers of review that just make it longer before you have a final resolution. I share the concerns about the federal commissioner having perhaps less authority than provincial counterparts, but there were some other options raised in this discussion, such as whether there could be a direct appeal to the Federal Court of Appeal or a specialized tribunal.

The key point is to ensure that the OPC is able to operate within that regime effectively. There have been discussions in terms of resources. There's a concern that was raised in terms of the new powers or responsibilities for the commissioner to verify codes of practice. The commissioner, I think rightly, raised the fact that, if that's the case, there may need to be some discretion in terms of where you focus that work because, otherwise, it can very quickly take a lot of your resources.

This is something that I did at the Human Rights Commission. We adopted a public interest strategic litigation approach, where we would focus our key resources on the key cases that would have the biggest impact for Canadians, and that was very successful.

June 13th, 2022 / 11:35 a.m.

Conservative

Ryan Williams Conservative Bay of Quinte, ON

Thank you very much, Mr. Chair.

Through you, and to echo the sentiments from the rest of our colleagues, it sounds like we have the right person in the role. Thank you very much for coming today.

I wanted to get a bit more into the old Bill C-11. Privacy is obviously a lot harder to protect these days, because so much of it is digital. You mentioned looking at consent, proportionality and the GDPR. Is there anything else you've seen in your work as a law clerk in assessing the old Bill C-11 and how effective it is? Do you see it modelling the GDPR from Europe at this point?

June 13th, 2022 / 11:30 a.m.

NDP

Matthew Green NDP Hamilton Centre, ON

Do you have more to comment on Bill C-11? I'm glad you brought that up, because it's certainly one that we seem to have gotten bogged down on. I'm wondering if you would share any perspectives on Bill C-11, the former one.

June 13th, 2022 / 11:30 a.m.

Nominee for the position of Privacy Commissioner, As an Individual

Philippe Dufresne

My main priorities are going to be ensuring that Canadians can have better understanding and better protection. The private sector law may well come first; certainly it did with Bill C-11. It would be a priority to ensure that Canadians can participate in the digital economy. Canada's market—