The Chair:
I call the meeting to order.
Good afternoon. Welcome to meeting number 91 of the House of Commons Standing Committee on Industry and Technology.
Today's meeting is taking place in a hybrid format, as per the rules. Pursuant to the order of reference of Monday, April 24, 2023, the committee is resuming its study of Bill C-27, an act to enact the consumer privacy protection act, the personal information and data protection tribunal act and the artificial intelligence and data act and to make consequential and related amendments to other acts.
I would like to welcome our witnesses today. We have, appearing as individuals, David Fraser, partner at McInnes Cooper; Éloïse Gratton, partner and national leader, privacy and data protection, at BLG, who is joining us by video conference; and Daniel Therrien, lawyer and former Privacy Commissioner of Canada. Ms. Gratton was my professor at Université de Montréal for a short time, so it's nice to see her again. Finally, from the Canadian Anonymization Network, we have Adam Kardash, partner, and Khaled El Emam, professor, both joining us by video conference.
Thank you all for being here today.
We are fortunate to have this panel for our study of Bill C‑27, so without further ado, I will turn the floor over to Mr. Fraser for five minutes.
Mr. David Fraser:
Thank you very much, and thank you for your kind invitation to appear before this committee to assist in its important study of Bill C-27.
I'm a partner in private practice at a law firm where I've been practising privacy law for 22 years. Most of my practice involves advising international businesses on complying with Canadian privacy laws. More often than not, they're trying to make their existing privacy programs, which they've developed in places like Europe and California, work in Canada. I also advise Canadian businesses, large and small, on compliance with these laws. I regularly advise organizations in connection with investigations and encounters with the Office of the Privacy Commissioner of Canada and his provincial counterparts.
I'm here in my own personal capacity, but obviously my work and opinions are informed by my experience working with my clients.
Now, I may come across as somewhat contrarian in saying this, but I actually think that PIPEDA works pretty well as it is. It was designed to be technologically neutral, based on existing principles that are largely embedded in Bill C-27. One thing I've often said is that Bill C-27 takes PIPEDA and turns it up to 11.
I don't think the legislation is necessarily broken. I think the commissioner has not necessarily exhausted all of his enforcement powers and authorities over the past 22 years.
I'd like to start by saying that I don't really like the name of the new statute. Canadians aren't simply consumers. This legislation applies to consumers. It also applies to certain employees in the federally regulated sector. It's a bit negative and dismissive. If we're wedded to the acronym CPPA, we could call it the “Canadian Privacy Protection Act”, but I don't think that actually affects its substance.
Now, I like PIPEDA, but over the last little while it's been pretty clear that there's an emerging consensus that order-making powers and penalties are desirable. Given that, I would ask the committee to consider that such powers require a commensurate and appropriate shift to greater procedural fairness than is currently in the bill.
Based on my experience, I'm of the view that the Privacy Commissioner potentially has a conflict in being a privacy advocate, a privacy educator, the privacy police, the privacy judge and the privacy executioner. Any determination of whether a violation of the CPPA has taken place and what penalties should be imposed should be carried out by an independent arm's-length tribunal, such as the Federal Court or the new tribunal. The commissioner can recommend a penalty and can take on the role of prosecutor, but ultimately the determination of whether or not a violation has taken place and whether or not a penalty should be imposed should be vested in an arm's-length body.
I think the recent Facebook case in the Federal Court is a bit of a cautionary tale. I'd be happy to talk more about that.
Children's privacy is obviously a very important theme in this particular piece of legislation. I agree with and appreciate the views of the government and the commissioner with respect to protecting the privacy of children.
One thing I'm a bit concerned about is that the current bill would be difficult to operationalize for businesses that operate across Canada. Whether or not somebody is a minor currently depends upon provincial law. That varies from province to province, and implementing consistent programs across the country would be difficult. I would advocate specifying in the legislation that a minor is anyone 18 years of age or younger.
I would also suggest that there be a presumption that children under the age of 13 are not able to make their own privacy decisions and that their parents should be their substitute decision-makers by default.
For organizations that offer a general service to the public—like a car dealership, for example—there should be a presumption that all of their customers are adults, unless they know otherwise. If you have a website that's focused toward children, you know there are children in the audience and you have to calibrate your practices appropriately. Anything different might lead to mandatory age verification, which can be very difficult and raises its own issues.
Having been involved in investigations and in litigation involving privacy claims, I would suggest that the "private right of action" be amended to be limited to the Federal Court of Canada, if you're wedded to a private right of action to begin with. The problem with the bill as drafted is that anybody can go to the Federal Court of Canada or a provincial court. We know that particular incidents are going to affect hundreds of people over the next decade or so. You're going to end up with duplicative proceedings running simultaneously across the country, and we already know that judicial resources are significantly taxed.
I think the legitimate purposes provisions, which are largely based on the European model, need to be more closely aligned with it. I'm happy to provide more details on what is happening in Europe.
With respect to the artificial intelligence and data act, it should be its own bill and subject to its own study. I would note that excluding the government from it is dangerous. The government has guns. The government decides about benefits, immigration and things like that. I think it's subject to a constitutional challenge. It's not necessarily harmonized with what's going on with our international trading partners, and there should be reciprocal recognition.
If a company is complying with European data regulation and we have deemed it to be substantially similar, that should work. Otherwise, we're going to have difficulty with Canadian businesses operating internationally and international businesses coming here.
Finally, I think research and development should be removed from the bill, because it presents no real risk of harm to an individual until it's introduced to the public.
I have a longer list. I could go on for much more than five minutes, but I think that's my time. I look forward to the discussion.
Ms. Éloïse Gratton:
Thank you for inviting me.
I'm pleased to be here today to share my thoughts on Bill C-27.
I am a partner at Borden Ladner Gervais and the leader of the firm's national privacy and data protection practice. Having worked in the field for more than two decades, I provide advice to large national companies in a number of industries across the private sector. Many of these companies have international operations as well, so I have followed the developments in the European Union's General Data Protection Regulation, or GDPR, in recent years. The GDPR is, of course, the EU's equivalent to our privacy legislation.
I believe this privacy reform process should draw on the lessons learned by Quebec and the European Union in reforming their privacy legislation.
I am here today as an individual. I'm going to switch to English now, but I would be happy to answer members' questions in English or French.
[English]
Today I stand before you to discuss a matter of paramount importance, the reform of the federal privacy law.
We find ourselves at a critical juncture. We have the unique opportunity to strike a balance that ensures the protection of our privacy rights while fostering an environment of innovation. In a rapidly evolving digital age, where information flows faster than ever before, our privacy is at an increased risk. This makes it imperative that we reform our privacy laws to reflect the realities of today.
However, data protection laws should not stifle the innovative spirit that has propelled us into the 21st century. Canada needs to remain competitive. Innovation drives economic growth, creates jobs and improves our quality of life. It is the engine of progress. Striking the right balance between privacy and innovation is a complex task, but I don't think it's an impossible one.
I'll focus my presentation on the consumer privacy protection act and areas of improvement for four specific issues that potentially impact innovation.
First, I absolutely welcome the introduction of a consent exception regarding specified business activities and for certain activities in which the organization has “legitimate interest” under subclause 18(3). This being said, the legitimate interest exception is actually narrower than the same exception under the EU's GDPR, the General Data Protection Regulation.
David raised this issue, so I'm going to talk a bit more about it.
Bill C-27 provides no exception, nor any significant flexibility, regarding the application of the consent rule to the collection of personal information from publicly available sources on the Internet. It prevents all organizations from leveraging data available on the web, including legitimate ones working on new products and services that may benefit society and that need a large volume of information.
In short, I submit to you that this legitimate interest exception should be more closely aligned with the GDPR legitimate interest legal basis to accommodate innovative types of business models while protecting the privacy interests of Canadians.
Clause 39 creates a new consent exception for disclosures of de-identified personal information to specific public sector entities, including government, health care and post-secondary educational institutions. Limiting this consent exception to disclosures to public sector entities, instead of both public and private sector entities, severely restricts its utility. Clause 39 should authorize and facilitate responsible data sharing among a broader range of actors so that they have access to talent and resources they can leverage to pursue socially beneficial purposes.
The third point is that the CPPA introduces new definitions for the terms “anonymize” and “de-identify” and provides greater flexibility regarding the processing of these categories of information. However, the proposed standard for anonymization under subclause 2(1) is more stringent than other recently updated privacy legislation, including the GDPR and the recently amended Quebec private sector act.
My point is that the CPPA should include a reasonableness standard instead of holding organizations accountable to an absolute standard that may be impossible to meet in practice. As you certainly know, access to anonymized datasets, with legal certainty, is crucial to research and development performed by Canadian organizations. I have a feeling that Adam Kardash and Khaled El Emam will be talking about this a bit more.
My last point is that clause 21 introduces a new consent exception for the use of de-identified information for internal research, analysis and development purposes.
Restricting such use to internal uses may limit collaboration and the fostering of research partnerships, preventing stakeholders from sharing datasets to create data pools that are broad enough for the production of useful and actionable insights. This section should authorize the use and sharing of de-identified information among different organizations.
I've submitted a short brief in French and English in which I provide additional detail on these four proposed changes. I think innovation and privacy can coexist, and the responsible use of personal information can be the cornerstone of building new and exciting technologies while respecting our fundamental rights.
Thank you, and I welcome questions.
Mr. Daniel Therrien:
Thank you, committee members, for inviting me to participate in your study.
I am here as an individual, but my experience as the federal privacy commissioner from 2014 to 2022 will certainly be reflected in my remarks.
To begin, let me say I agree with my successor, Philippe Dufresne, that the bill before you is a step in the right direction, but that it is necessary to go further in order to properly protect Canadians. I also agree with the Office of the Privacy Commissioner's 15 recommendations for amending Bill C-27, with some nuances on audits, remedies and appeals. The government has taken up, at least in part, a good number of the recommendations I had made regarding Bill C-11, the predecessor to Bill C-27. Among those that were not accepted is the application of privacy law to political parties.
I am very pleased that a consensus appears to have emerged among political parties to recognize in the law that privacy is a fundamental right. I applaud parliamentarians for that decision. The question now becomes how to best translate into law the principle with which you now all agree.
[English]
The minister suggests amending the preamble and the purpose clause of the CPPA. These are steps in the right direction, but they are not sufficient. You should also amend two operative clauses: proposed section 12 of the act on "appropriate purposes", and proposed section 94, which provides for administrative monetary penalties for certain violations of the law. Without these amendments, the law would still give greater weight to commercial interests than to privacy, which is a fundamental right. This does not appear to be your intent.
Based on my reading of parliamentary debates, it also seems to me there's consensus around the idea that privacy and economic growth through innovation are not in a zero-sum game. The question is generally not on deciding which should prevail—privacy protection or innovation—as both can and should be pursued at the same time. It is only in rare cases that it will not be possible. In those cases, privacy as a fundamental right should take precedence.
Proposed section 12 of the CPPA does not, in my view, faithfully translate this consensus. Rather, it upholds the traditional approach, which is that privacy and economic goals are conflicting interests that must be balanced without considering that privacy is a fundamental right. This may have made sense under the current act's purpose clause, but it will no longer make sense if the CPPA's purpose clause recognizes privacy as a fundamental right, as is currently proposed.
Proposed section 12 is central to the exercise that commercial organizations, the Privacy Commissioner and ultimately the courts will have to go through in order to determine the factual context of each case and the weight given to privacy and commercial interests.
[Translation]
Section 12 as drafted gives more weight to economic interests. It does that in several ways.
The first is through the terminology it uses. It refers to “business needs” and does not refer to privacy as a right, fundamental or otherwise.
When the proposed section does refer to privacy, in paragraphs (2)(d) and (e), it is as an element to consider in achieving business goals, with privacy loss to be mitigated where possible, that is, where business goals can be achieved at comparable cost and with comparable benefits.
Nowhere is it mentioned that privacy protection is an objective at least as important as economic goals. On the contrary, the focus is on economic goals, with privacy loss treated as something to be mitigated, where possible, in the pursuit of those goals.
I have provided you with my proposals for amending section 12, and they would be consistent with the amendments proposed at section 5.
With respect to sanctions, all violations of section 12, including the appropriate purposes clause at subsection (1), should potentially lead to administrative monetary penalties. Without sanctions, recognizing privacy as a fundamental right would be a pious wish, without real consequences.
I would go further and recommend that all violations of the CPPA should be subject to these penalties. This would align Canada with most other jurisdictions.
[English]
I have a few words on the Artificial Intelligence and Data Act. That part of Bill C-27 is brief, even skeletal, and leaves a lot of room for regulations. While I understand why some are concerned with this, I think this approach is defensible, given that AI technology is relatively nascent and is certainly evolving very quickly; however, the lack of precision in AIDA, in my opinion, requires that certain fundamental principles and values be recognized in the act itself. First and foremost, the act should recognize the importance of protecting fundamental rights, including the right to privacy, in the development and implementation of AI systems.
Finally, some of you expressed concerns in an earlier meeting about the difficulty of detecting violations of the law and the potential value of proactive audits to facilitate detection. As commissioner, I had recommended proactive audits, and I still believe they are a necessary part of an effective enforcement regime. This is particularly true in the case of AI.
Thank you. I would be pleased to take your questions later.
Mr. Adam Kardash:
Thank you. Good afternoon, everyone.
My name is Adam Kardash. I'm chair of Osler, Hoskin and Harcourt's national privacy law and data management practice, and I've been practising exclusively in the privacy area for more than 20 years.
I'm pleased to be before INDU on behalf of CANON, the Canadian Anonymization Network, which is a not-for-profit organization whose members comprise large data custodians from across the public, private and health sectors.
I'm joined this afternoon by Khaled El Emam, Canada Research Chair in Medical AI at the University of Ottawa and the leading global expert on anonymization and de-identification technologies and methods.
As you are aware, Bill C-27 introduces definitions of anonymized data and de-identified data within the text of the proposed consumer privacy protection act. The concept of anonymized data is a core feature of the CPPA, as it clarifies the scope of application of the CPPA's privacy legislative scheme.
There are several very important provisions throughout the CPPA related to the terms de-identification and anonymization. It is therefore essential that the CPPA provisions relating to these terms—anonymized and de-identified data—be carefully considered and appropriately articulated within the CPPA's legislative scheme.
In August of 2022, CANON struck a working group to conduct a thorough legal consideration of Bill C-27, and we received comments from stakeholders across all sectors as part of a consultation process, including a workshop attended by over 100 participants.
CANON is proposing surgical revisions that provide critical clarifications to several provisions within the CPPA, including the proposed section 39 provision referenced by my colleague Éloïse Gratton. We're proposing additional privacy protections for disclosures without consent for socially beneficial purposes. The details are contained within the written submission we provided to INDU.
Our most important recommendation relates to the CPPA's current definition of “anonymize”. The current definition provides that personal information would be anonymized only if it is “irreversibly and permanently” modified in accordance with “generally accepted best practices, to ensure that no individual can be identified from the information, whether directly or indirectly....”
We are proposing an amendment as a surgical addition to this definition, as the current text of the definition of "anonymize" sets an extremely high and practically unworkable threshold for the circumstances in which information would no longer be deemed to be identifiable. Specifically, anonymized data within the CPPA does not incorporate the concept of reasonably foreseeable risk in the circumstances and therefore is not consistent with the standard for anonymization within legislative schemes across the country, including Quebec's Law 25, Ontario's Personal Health Information Protection Act and multiple other statutes cited in our submission. There are at least 12 such statutes cited in our brief for your consideration when you're reviewing it.
To be clear, and this is critically important, there is a very high legal standard for anonymization right now in Law 25, under PHIPA and under all these other statutory frameworks. It's very high, but unlike the CPPA, the anonymization standard in these other legislative schemes is practically workable. The reason is that it expressly contemplates contextual risk.
As a result of these concerns, CANON has proposed an amendment to the CPPA's definition of “anonymize” that simply incorporates the concept of reasonably foreseeable risk in the circumstances into the definition. Our proposed surgical amendment would align the CPPA's concept of anonymized data and, critically, ensure the interoperability of the CPPA with the standard for anonymization within other legislative schemes across Canadian jurisdictions. Our proposal is fully consistent with well-established Canadian jurisprudence on the scope of the concept of personal information, the citations for which we provide in our submission.
I'm going to turn my comments over now to Khaled El Emam to conclude our introductory remarks.
I want to use my time today to highlight the practical importance of CANON's proposals to the definition of “anonymize”.
My comments today are based on my experience with anonymization over the last two decades, in both research and practice. A core focus of my work has been the anonymization of health data such that it can be used and disclosed for research purposes, which includes developing new treatments and devices to help patients.
In my view, the CPPA's current definition of “anonymize” most often will not work well in practice when interpreted literally. It risks setting an unachievable standard that in practice is not necessary for good privacy protection. The text needs to reflect the reality that the outcome of anonymization is not absolute. It is well established among anonymization and data de-identification experts that data anonymization is a process of risk management. This is a foundational element of the recently published ISO international standard for data de-identification. Good contemporary practices, when implemented properly, can ensure that the re-identification risk is very small. Very small re-identification risk can be precisely defined and has been precisely defined by organizations such as Health Canada.
Effective re-identification risk management involves using techniques and technology to modify data as well as the implementation of appropriate administrative and technical controls. The combination of modified data that has been wrapped with appropriate administrative and technical controls ensures that the re-identification risk can be made very small.
This concept of risk management will not ensure that the re-identification risk is zero or that anonymized data is absolutely irreversible. That is not a practical standard that can be met. This is why it's important to amend the current definition of the term “anonymize”, which currently implies zero risk.
Our proposal supports the important and necessary requirement currently within the CPPA's definition that generally accepted best practices be followed during the process of anonymization, but the CANON proposal adds the concept of reasonably foreseeable risk in the circumstances so that the definition is actually workable in practice.
Based on my years of developing and implementing anonymization methods and technology, I believe, on behalf of CANON, that the implementation of these proposals will enable more responsible use and disclosure of data than the current definition allows.
We thank you in advance for your consideration. We would be pleased to answer any questions you may have.
:
Thank you, Mr. Chair. Thank you, witnesses.
My first series of questions is for Mr. Therrien.
You were the Privacy Commissioner during the development of the replacement for PIPEDA in the last Parliament, Bill C-11, and presumably in the run-up to the development of this one. The current Privacy Commissioner was here last week and said essentially that he personally wasn't the commissioner who was consulted on it.
This is a critical bill because it's a complete replacement of PIPEDA. It's not an amendment.
I'll start by asking you if, in the development of Bill C-11, the Minister of Industry of the day—I believe it was Mr. Bains—consulted with you before the bill was tabled in Parliament.
:
I would say that proposed sections 12, 15 and 18 are critical on the privacy part. I agree that AIDA is a blank slate, and we'll come to that another time—hopefully today.
Proposed section 12 sets out the purpose. Proposed section 15 talks about express consent and then, in proposed subsection 15(5), says that it's okay to use “implied consent”. Then proposed section 18 says that a business has “legitimate interest” to use an individual's data basically however it wants, even if it harms the individual.
To me, that's where it places the emphasis. When you take proposed section 5 and then add proposed sections 12, 15 and 18 to it, it looks like big business and its right to use your data are being protected, even if that use harms you.
Do you not need to amend all of those proposed sections, not just proposed sections 5 and 12?
Mr. Daniel Therrien:
I should probably start by clarifying that statement.
One of the underlying principles of privacy is that individuals retain control over their personal information. That idea goes back to the early 1970s, before the Internet came along. Things have obviously changed since then. Today, we are dealing with huge amounts of information and complex business models, not to mention partnerships. On top of that, privacy policies are very long, complex and detailed to ensure that individuals have all the information. However, they don't take the time to read all that information because it's so complex and burdensome.
Keeping that in mind, I think it's worthwhile to try to reduce the need for consent and to focus on situations that require the individual's consent, while introducing other legal grounds for protecting the individual, a bit like what Europe did with the GDPR. In that respect, with the exceptions to consent, I think the bill is definitely a step in the right direction.
Clearly, other safeguards are needed. For instance, in order for the legitimate interest exception to apply, the company has to document why it considers the collection or use of the information acceptable and carry out a risk assessment. There are safeguards. Companies have to do a bit more work to make sure that they are protecting individuals' rights, and they are subject to penalties. Companies want to be compliant and good corporate citizens, of course, but they also want to avoid penalties. With the penalties, which are in line with what we see in Europe, the bill provides that incentive.
Ms. Éloïse Gratton:
I'll start, if that's okay.
So far, the Privacy Commissioner's office has operated on an ombudsman model, and it also has an advisory branch. That's quite useful.
This means that when there's an investigation, there's a conversation. There's a dialogue. In some cases, businesses can go knocking on their door and say: “Hey, what do you think about this business model? We want your input.” I'm just concerned that if there's a tribunal, will that relationship potentially be impacted? I guess that's concern number one.
My other concern is the fact that a lot of these privacy principles are quite flexible, and we need that in our privacy law. On the notion of consent, sometimes it's express and sometimes it's implied. It's subject to the reasonable expectation of the individual. Security measures have to be adequate in light of the context. There are so many grey zones and so much uncertainty, and now it's in the law; it's no longer just principles. Adding the tribunal is perhaps just a layer of risk for businesses that have to navigate a lot of grey zones in the law.
Mr. Daniel Therrien:
I will try to be brief.
The goal of these provisions should be to provide quick and effective remedies for citizens. In no other jurisdiction that I know of is there a tribunal such as the one proposed in this legislation. In all other privacy jurisdictions, the original decision-maker, with the power to make orders and impose fines, is the data protection authority that is the equivalent of the Office of the Privacy Commissioner.
I hear concerns about the difficulty of the OPC exercising different roles. That is not a problem in other jurisdictions. It is well known in law that it is possible for an administrative tribunal to have investigative, advisory and adjudicative functions. This needs to be managed, and it can be managed. There is no problem there.
I think the tribunal will create delays and will simply be duplicative of the expert work of the Office of the Privacy Commissioner. Again, there is no precedent internationally for this.
Mr. Adam Kardash:
I personally am fully in favour of the tribunal.
I think it's important to start the conversation by looking at the sheer quantum of the potential penalties for contravention of the act, which, compared with any other statutory framework, is massive. With larger corporations, it's hundreds of millions of dollars.
As Mr. Fraser mentioned in his opening remarks, when you're introducing a regime with that level of penalty, which could be impactful for businesses in every constituency here, it's absolutely imperative that you have a procedural fairness piece, and I think everyone agrees with that. The tribunal will add to that procedural fairness, and it will allow for, in my view, an appropriate determination of whatever the penalty is or should be in a particular circumstance.
Mr. Adam Kardash:
It's a good question. These are technical terms, and they often cause confusion.
CANON was established to help demystify this terminology, because that ambiguity creates uncertainty and uncertainty creates reticence risk. It's an issue.
Simply put, de-identifying data is the removal of direct identifiers; the current language in the CPPA captures that quite elegantly. When you remove direct identifiers, you still have indirect identifiers. In other words, the data is still potentially identifiable. De-identified data is still regulated by the statutory framework.
Anonymized data, which was the subject of my opening remarks, has a more exact definition that sets the standard for the application of the statute. Given how technical these terms are, I think it's really important to go through the definition. The current definition talks about irreversible and permanent modification in accordance with generally accepted best practices to ensure that an individual cannot be identified from the information, directly or indirectly.
Our view, and the view supported by our extensive consultations and jurisdictional analysis, is that it doesn't work. You need the contextual piece of the reasonably foreseeable risk in the circumstances, which is embedded in Law 25 and in PHIPA. You'll see these other regimes in the briefs we have provided you.
Anonymized data means there's no foreseeable risk, in the circumstances, to identify the individual.
:
The reason I say that is there is a case study we can use. Mr. Therrien knows well about this.
In the holiday season of 2021, Telus was selling data to the Public Health Agency of Canada. Canadians who went out during a lockdown to visit the pharmacy or went to the grocery store were tracked, and that data was sold to the Canadian government.
We did then talk about this in the ethics committee.
Mr. Therrien, you were very succinct in your comments. There were two parts to this. There was not implied consent. You noted, “While there is reference to 'data for good' programs somewhere in the Telus privacy policies, while the government does make an effort to inform citizens...I do not think anyone would seriously argue that most users knew how their data would be used.”
I'm trying to back this up. My real question is, does this act, with your amendments, fix that situation?
I'm going to ask Mr. Kardash that first.
Mr. Therrien, the question for you afterwards would be this: Does this act go far enough to address the consent model we're looking for if this were to ever happen again?
Mr. Kardash, I'll start with you.
Mr. Adam Kardash:
The requirements for consent in the regime apply to personal information. That can include de-identified data, which is just data with the direct identifiers removed. They don't apply to information that is not personal information, that is, where it's not reasonably foreseeable that an individual could be identified.
The Office of the Privacy Commissioner of Canada did an excellent job in that investigation. I know it well; I acted in that investigation. In their careful analysis, they determined that the data that was received by the Public Health Agency of Canada was not identifiable in the context of the disclosures that took place. Therefore, if the data was not identifiable, it's not personal information. If it's not personal information, it wouldn't be subject to the consent requirements or to the statutory regime.
Our surgical amendments make no difference to that. In fact, they reflect the current law.
Again, I can't overstate the exceptionally high standard for what is personal information right now. You have to look contextually at the circumstances and you have to look at the technical methods for de-identifying, which are wrapped in administrative controls, security controls and physical controls. That suite of controls was implemented on top of some very sophisticated methods to ensure that the Public Health Agency of Canada, as determined by the Office of the Privacy Commissioner of Canada, did not receive any identifiable data.
:
What I can say is that of the four recommendations I'm getting, three out of four are actually proposing that our law be more aligned with the laws of Europe or Quebec, which are actually more stringent.
It's also an issue with interoperability and making sure that our requirements are harmonized, especially when they make sense. We don't need to reinvent the wheel. If Quebec got it right, and if in Europe they got it right through the GDPR, why are we reinventing the wheel?
Perhaps one issue I'd like to raise is that in Europe they had interpreted their requirements to mean that websites need cookie banners, and five years later they're reassessing that. There's a movement in Europe, the cookie pledge, under which they are re-examining whether these cookie banners, which are extremely complex, are actually better protecting website users. People are just accepting them.
I think maybe one lesson learned from Europe that we should not replicate here is pushing for website cookie banners.
Mr. Kardash, the Canadian Anonymization Network makes a particularly interesting case. According to a paper you published in May 2023, the current definition of "anonymize" sets an extremely high and virtually unattainable threshold for the circumstances under which it can be concluded that information can no longer be used to identify someone.
The document refers to Law 25, adopted by the Quebec National Assembly in 2021. That law uses more moderate language to ensure that anonymization is achievable, and your paper advocates the adoption of similar language in order to ensure interoperability between the two regimes.
In your opinion, if the language is left as it is in the current bill, what will be the implications for Quebec companies, particularly small and medium-sized businesses that will be subject to Law 25, since it would take precedence, but also to this bill if their operations cross provincial borders?
Mr. Brian Masse:
I apologize for leaving off and on during the meeting. There are world events that particularly complicate my riding in Windsor and the Detroit region. I apologize if I repeat something or miss something, but I will go back and listen to the rest of the testimony I've missed from the witnesses.
Mr. Therrien, I do want to ask a question about a certain situation. The Competition Bureau recently had to pay costs for investigating and opposing the Shaw takeover by Rogers, after the tribunal ruled against it. Through other testimony, we learned that the same process could happen here for the privacy commissioner under this legislation. We have to sort that out, because I was told one thing by one witness and heard different testimony from another.
Again, with the tribunal, I know you have a little more to offer. On creating this type of body, do you really think it could undermine the strength of the privacy commissioner in general? I worry about that because, while I know the United States doesn't have this model, ours has actually served Canadians quite well.
I'd like you to expand on the vulnerability if we change the route that we have right now.
Mr. Daniel Therrien:
I think there is a risk of the OPC being undermined—in the following way, at least. The federal office works—and has to, because data flows internationally and across provinces in Canada—with colleagues in Canada and internationally.
As I explained in my earlier answer, there is no other jurisdiction with the type of tribunal that is proposed federally under the CPPA. That would put the OPC in a situation such that when it conducts joint investigations with colleagues across Canada or internationally, its position would be effective later than that of its colleagues. The OPC would then have to wait for the blessing of the tribunal for an order to be upheld or for a penalty to be imposed. That's one thing.
However, even more importantly, Canadian citizens would have to wait longer than others in other jurisdictions, including in Quebec. The CAI in Quebec has order-making and fine-imposing powers. There's a difference in the rapidity with which Canadian citizens would have protection compared to other jurisdictions.
:
Okay. That answers my question. We don't know.
What I hear from you today is.... We're talking about consent as well. My concern is the consent of a child. We don't have a definitive age for a minor in this legislation.
Do any of you want to comment on whether we need to define the age of a child, and whether there should be different tiers of consent related to a minor in this legislation?
Again, it all goes back to my kid on an iPad, when I'm sitting on an airplane or driving somewhere. He's in the back and he's clicking on certain things. I don't know where that information's going, and I get scared about that. I'm fearful as a parent. I know every committee member around here has similar concerns.
How can we address that in this legislation to make sure that children are protected?
I just want to say thank you to the witnesses for appearing before the committee and for their useful testimony.
My questions are playing off what Mr. Vis said. I wanted to talk about socially beneficial purpose as well. My questions are for Mr. Kardash.
We know that according to proposed section 39 of the consumer privacy protection act, an organization has the right to disclose to certain entities de-identified personal information without the knowledge or the consent of the individual if the disclosure is made for a socially beneficial purpose. That is defined in the bill. It means “related to health, the provision or improvement of public amenities or infrastructure, the protection of the environment or any other prescribed purpose.”
Do you think that this definition of “socially beneficial purpose” is enough to protect the privacy interests of Canadians, in addition to the fact that this is already de-identified information, which, as you said in your opening testimony, is already a pretty high bar? It's an exacting standard.
Mr. David Fraser:
Certainly I agree that it's two different models. We also have, for example, a human rights commission and a human rights tribunal, a competition commissioner and a competition tribunal. There are other scenarios in Canada in which that particular model is applicable. There is the possibility for conflicts, and one would have to have controls and procedural safeguards within the Office of the Privacy Commissioner of Canada to make sure that those conflicts did not arise.
Given the stakes that this legislation presents, with multi-million-dollar penalties, even multi-billion-dollar penalties when you look at percentage of a company's global turnover, it raises the requirement for additional procedural safeguards. You can think of it as a scenario in which a police officer can write a ticket, and you could pay the ticket and plead guilty and go on your way, or you can dispute it, and the police officer has the burden of proving in front of an impartial decision-maker whether or not the facts alleged in that ticket are borne out. That would be the model that I would advocate.
Otherwise, maybe we can split the difference, and when it comes to anything that has a significant penalty over a certain threshold, it would require those additional safeguards. Those are going to be important.
I would also note that we're seeing more and more multi-jurisdictional investigations taking place simultaneously, so organizations are going to be subject to multiple penalties in multiple places arising from the exact same investigation. The fine threshold in Quebec is similar to the fine threshold here. You could see those fines being doubly levied, which again, at least to me, raises the stakes even higher.
Mr. David Fraser:
I guess I would start with wondering whether you think that we should have a world without Facebook, ChatGPT and things like that.
In my view, this legislation takes what we have existing in PIPEDA and largely, as I said, turns it up to 11, so it puts a greater requirement of diligence on the part of organizations in order to, for example, justify their decision-making, document risks and do those sorts of things, and then it has those substantial penalties.
Had this been implemented 10 years ago, I'm not sure that the universe would be all that different, because I think it's still based on the 10 principles from the Canadian Standards Association's code for the protection of personal information, which are very Canadian principles with respect to privacy.
I am very curious to hear from Mr. Therrien in terms of how he thinks it would have been different had he entered office with the CPPA at his disposal.
Mr. Daniel Therrien:
I think that the CPPA brings us much closer to where we ought to be in 2023. With the new prominence of artificial intelligence, the AIDA portion of Bill C-27 is an attempt to align Canada's legislation with that new technology.
There's no perfect solution in all of these situations. There are people who think that the artificial intelligence act is so skeletal as to be meaningless, and there's some merit to this. I think it's okay for where we are today.
One virtue of the legislation before you is that it continues with the consent model in the many circumstances in which consent can possibly be given, but it also recognizes that there are important limits to the consent model, such as legitimate interests and socially beneficial purposes. The missing piece, I think, is that these additional flexibilities, which reflect the current use of technology, have to be implemented within a rights protection framework.
Although the minister's latest amendments bring us a bit closer, we are still quite a way from where we ought to be. That is why I recommended amendments to proposed sections 12 and 94, particularly on penalties. What is the value of recognizing privacy as a fundamental right if there is no penalty when that principle is breached?
Mr. Daniel Therrien:
In brief, my point there is that it is extremely difficult, if not impossible, for individual consumers to understand how their data is used. It is even difficult for the regulator to understand how data is used.
How will violations be identified if we rely mostly on individual consumers to make complaints? There are provisions, I know, for commissioner-initiated complaints, but the model we have is premised mostly on the basis that individual consumers will complain.
In many situations, they don't know there's a violation. Proactive audits exist in other jurisdictions I've mentioned in my document. The regulator can audit the practices of a company, not because there is a belief that a violation has already occurred, but simply to reassure consumers. If the new practice complies with the law, consumers can have confidence that it is privacy-protective; if it does not, the company will have to amend its practices.
I think proactivity is extremely important.
Ms. Éloïse Gratton:
Mr. Therrien is reluctant to speak because this involves the division of powers. Many of these issues fall under provincial jurisdiction. In Quebec, for example, the Civil Code governs the rights of the child.
I don't think those laws need to go beyond dealing with consent and protecting the data of children held by private sector organizations. That's really what these laws are designed to protect.
A little earlier, we talked about the age of consent. The bill could be more specific in some respects about the type of consent required of a child, depending on their age. In Quebec, that distinction is made, but elsewhere in the world it often varies. There is the age of majority, and there are young children; between the two are young people between the ages of 13 and 18 or 19, depending on the age of majority.
In Quebec, the age of consent has been set at 14. This creates a lot of operational problems for organizations that want to put safeguards and measures in place to protect children. We should just keep that in mind.
My view—and we've thought about this quite carefully—is that there is no public policy rationale for the political parties' processing of personal information not to be subject to a privacy legislative regime. The only question that I think is open is what the appropriate instrument would be and whether that would go into the CPPA. I think there's some validity to the proposition that it might be a separate instrument. My personal view is that it was something that was missing in Bill C-27. It could have been in there.
Right now, if you compare the privacy protections that are set out in Bill C-27 under the CPPA to the current protections afforded to individuals in respect of the processing of personal information by political parties, you see that they're not even in the same universe. You would just have to post a privacy statement. There's no security breach notification requirement. There are no access rights and no consent rules. It goes on and on. There are no express rights of redress. There's no independent ombudsman who would oversee and take complaints, investigate and so on.
I think this is something that is incredibly important and I'm very thankful to you, Mr. Masse, for bringing that up.