The Chair:
I call this meeting to order.
Good afternoon, everyone.
Welcome to meeting number 90 of the House of Commons Standing Committee on Industry and Technology. Today’s meeting is taking place in a hybrid format, pursuant to the Standing Orders.
Pursuant to the order of reference of Monday, April 24, 2023, the committee is resuming consideration of Bill C‑27.
I’d like to welcome our witnesses today, from the Office of the Privacy Commissioner of Canada. First, we are hearing from Philippe Dufresne, Privacy Commissioner of Canada.
Thank you for joining us again today.
Next, we have Lara Ives, executive director, Policy, Research and Parliamentary Affairs Directorate, as well as Michael Maguire, director, Personal Information Protection and Electronic Documents Act, Compliance Directorate.
I thank all three of you for coming back. I'm confident that everything will go well today—I'm looking at my colleagues—and that we'll have a chance to have a normal meeting and benefit from your insights on Bill C‑27.
Without further ado, Mr. Dufresne, I'll give you the floor for five minutes.
Ladies and gentlemen members of the committee, I am pleased to be back to assist the committee in its study of Bill C‑27, which would enact the Consumer Privacy Protection Act, the Personal Information and Data Protection Tribunal Act, and the Artificial Intelligence and Data Act.
When I previously appeared before the committee three weeks ago, I delivered opening remarks about the bill and presented my 15 key recommendations to improve and strengthen it. Today, I want to briefly highlight and respond to the letter the minister sent to the committee on October 3, 2023, and to answer any questions that you may still have.
[English]
I welcome the minister's stated position on the amendments being developed with respect to the proposed CPPA, in which he seems prepared to agree with four of my office's 15 key recommendations, namely by explicitly recognizing privacy as a fundamental right; by strengthening the protection of children's privacy; by providing more flexibility for my office to use compliance agreements, including through the use of financial penalties; and by allowing greater co-operation between regulators.
I also note and commend his statement of openness to further amendments following the study by this committee.
I would like to take this opportunity to highlight other ways in which the bill should be strengthened and improved in order to better protect the fundamental privacy rights of Canadians, which are addressed in our remaining recommendations to the committee.
I will briefly highlight five of our recommendations that stand out in particular in light of the minister's letter, and I would be happy to speak to all of our recommendations in the discussion that will follow.
First, privacy impact assessments, PIAs, should be legally required for high-risk activities, including AI and generative AI. This is critically important in the case of AI systems that could be making decisions that have major impacts on Canadians, including whether they get a job offer, qualify for a loan, pay a higher insurance premium or are suspected of suspicious or unlawful behaviour.
While AIDA would require those responsible for AI systems to assess and mitigate the risks of harm of high-impact AI systems, the definition of harm in the bill does not include privacy. This means that there would be proactive risk assessments for non-privacy harms but not for privacy harms. This is a significant gap, given that in a recent OECD report on generative AI, threats to privacy were among the top three generative AI risks recognized by G7 members.
In my view, responsible AI must start with strong privacy protections, and this includes privacy impact assessments.
[Translation]
Second, Bill C‑27 does not allow for fines for violations of the appropriate purposes provisions, which require organizations to collect, use and disclose personal information only in a manner and for purposes that a reasonable person would consider appropriate in the circumstances. This approach would leave the federal private sector privacy law an outlier compared with the European Union and the Quebec regime, which allow the imposition of fines for such important privacy violations.
If the goal is, as the minister has indicated, to have a privacy law that includes tangible and effective tools to encourage compliance and to respond to major violations of the law in appropriate circumstances—an objective I agree with—I think this shortcoming surely needs to be addressed for such a critical provision.
[English]
Third, there remains the proposed addition of a new tribunal, which would become a fourth layer of review in the complaints process. As indicated in our submission to the committee, this would make the process longer and more expensive than the common models used internationally and in the provinces.
This is why we've recommended two options to resolve this problem. The first would be to have decisions of the proposed tribunal reviewed directly by the Federal Court of Appeal, and the second would be to provide my office with the authority to issue fines and to have our decisions reviewable by the Federal Court without the need to create a new tribunal, which is the model that we most commonly see in other comparable jurisdictions.
Fourth, the bill as drafted continues to allow the government to make exceptions to the law by way of regulations, without the need to demonstrate that those exceptions are necessary. This needs to be corrected as it provides too much uncertainty for industry and for Canadians, and it could significantly reduce privacy protections without parliamentary oversight.
[Translation]
Fifth, and finally, the bill would limit the requirement for organizations to explain, upon request, the predictions, recommendations or decisions that are being made about Canadians using AI, to situations that have a significant impact on an individual. At this crucial time in the development of AI, and given the privacy risks that have been recognized by the G7 and around the world, I would recommend more transparency in this area rather than less.
With that, I would be happy to answer any questions that you may have.
:
Thank you, Mr. Chair, and thank you, Commissioner.
In my view, an individual's personal information in the digital world, and in the artificial intelligence world we're evolving into, must be protected from abuse by businesses, whether what they do with that information is intentional or unintentional.
After eight years, this new Liberal privacy bill, which is flawed, was introduced 18 months ago and then sat for a year in the House before it was brought forward for debate. You are Canada's Privacy Commissioner, the guardian of privacy for individuals in this country. Did the Liberal government consult and involve you in the development of this bill before it was introduced in June 2022?
:
It should be. This is why I've recommended this explicit recognition.
As you know, up until now it was sometimes described as a privacy interest or as a right—there was some more tepid language, I suppose—and my strong recommendation was that we need to make this explicit. We need to recognize that it is quasi-constitutional, as the courts and the international community have said. I recommended adding it in the preamble as well as in the purpose clause, but you're right: the purpose clause is the key. If you use the words “fundamental right” there, you are sending a signal to courts, to decision-makers and to me that even when you are balancing this with other elements, such as the needs of organizations—which have to be considered; we have to have innovation at the same time—if there is a clear conflict, one should prevail, and it is the fundamental right that should prevail. This is why it's so important that this be enshrined in the law.
I was encouraged by the minister's statement that this is now the intent. It's certainly something I've been advocating for since day one.
I have a list, in fact. The minister said he wanted to implement four of the recommendations, which leaves 11 of the 15 we put forward. Of those, I mentioned five, in particular.
The first recommendation is requiring PIAs for new technologies that can significantly impact Canadians, like generative AI. In my eyes, that is a major gap in the bill. Risk assessments are required for other types of harm and bias, but not for privacy harms. That seems contradictory, since it goes against an OECD finding: threats to privacy are among the top three risks. Privacy absolutely has to be prioritized.
The second recommendation is requiring organizations to be more transparent about decisions that are made using AI. As it stands, the bill sets out the right to an explanation, which exists in other regimes. That right, however, is limited to decisions that significantly impact people. I recommend removing that proviso, so that people have the right to transparency and an explanation whenever a decision about them is made, no matter how great the impact.
People in the AI world are worried. We are hearing that more and more. They need reassurance. There are huge benefits to AI. Personally, I think more transparency will help people understand what AI is and what it isn't, and show them that they are protected by a robust privacy regime.
The third recommendation revolves around administrative monetary penalties. They are used only as a last resort. I'm not saying this because I want to see them used—I hope that won't be necessary—but I would like those penalties to incentivize decision-makers to make good decisions. There is a gap, though. Currently, one of the biggest violations in the bill is not subject to an administrative monetary penalty. I'm talking about contravening the appropriate purposes provisions. I think this is a major consideration.
The fourth recommendation deals with the broad regulatory authority being given to the government, specifically the ability to make exceptions to the act without having to demonstrate that those exceptions are necessary. That is overly broad, in my eyes. A provision in the bill even allows the government to make regulations to completely exclude an activity from the application of the act. That goes too far and must be rectified.
The fifth and final recommendation concerns the proposed creation of a tribunal as another layer of review. This would lead to a longer, more expensive process and require the creation of a new structure. It diverges from the regimes in Quebec, Europe and other jurisdictions. Here's what I recommend: if a tribunal is set up, its decisions should be reviewed directly by the Federal Court of Appeal. That would add a layer of review while removing another. The other option is to follow other models by giving my office the authority to issue fines and making those decisions reviewable by the usual court, as is the case in most regimes.
:
Currently, the bill gives my office the authority to issue orders.
Most cases would probably involve the use of orders instead of administrative monetary penalties. The bill does a good job of prescribing the use of penalties. It lists the factors that must be taken into account, including the organization's approach and diligence, and whether it complied with a certification program. Whether the organization acted in good faith really matters, as do the efforts it made.
My office has the authority to issue orders, which is extremely important. I think the penalties are high enough, but with the use of orders, it's possible to put a halt to the activity and the collection of the information. Both of those are very important. Persuasion and negotiation are also tools, of course. That's why we recommended the use of compliance agreements, something the minister agreed to. That ability is also very important.
My preferred approach is to use dialogue and to encourage organizations to make the right decisions before they go astray.
:
This is my problem with the government on this. I'm not even sure they're serious about this bill anymore. We have a hard time getting amendments in. There's the drama that went around that. Maybe we'll get the amendments from the department tomorrow, if there's compliance. That's something I asked for directly. Hopefully we can have our own researchers and analysts test that out.
In many respects, it almost makes it a moot point for the Competition Bureau, an independent public entity, to be able to challenge the conglomerates and powers that be. In fact, the $9 million is a drop in the bucket, if that was for Rogers. It's a squeeze on the Competition Bureau, and it clearly sends a chill down the spine of basically anybody who's interested in consumer rights in Canada. You can basically be bullied into a corner by a legal process.
That's good to know. I had some reservations about the tribunal to begin with. If that's the case, then this is much more abhorrent.
With regard to your 15 recommendations, could we walk through the ones the minister has agreed to? I want to make it clear for those who are here. Can you identify which of the 15 you've submitted here today are the ones the minister has agreed to?
:
Those would be the top priority, starting with the notion of a privacy impact assessment for generative AI. To me, that is a major shortcoming.
If you look at AIDA and at the minister's proposed amendments to AIDA, you see a lot of discussion about risk mitigation, identifying risk and managing risk. This is absolutely essential and critical. However, we need to do this for privacy as well as for non-privacy harms. I'm very much insisting on this.
The other important recommendation, which I would say is the top priority, is making sure that fines are available for violation of the “appropriate purposes” provision. This is a violation of section 12. This is the key central provision. This is at the heart of the bill in a way, but there are no fines for that. That, in my view, should be corrected. It's easily corrected by adding that to the list of the breaches.
Other comparable legislation, like Quebec's, for instance, simply says, “a violation of the law”. The whole law is there. It's all covered. This approach lists offences, and in the bill there were more omissions. It's been corrected to some extent, but it needs to be corrected further.
I talked about algorithmic transparency. It is an important element, especially at this time in AI. Again, we can manage that by providing guidance to industry, so it's something that's workable, but I think Canadians need to understand what is going on with their data and how decisions are made about them. If we limit it to matters that have significant impact, we're creating debates and limiting the transparency that Canadians deserve.
That is—
Before going to Mr. Williams, if you will allow it, colleagues, I'll grant myself a minute for one quick question to Mr. Dufresne.
Some hon. members: Agreed.
The Chair: There is unanimous consent.
[Translation]
I have a question for you, Mr. Dufresne.
If you were playing devil's advocate, what would you say is the best argument for a tribunal?
:
One of the concerns that's been expressed publicly, as I understand it, is that too much responsibility or authority is going to a single body, in other words, my office. Since the bill provides the authority to issue orders and significant fines, more procedural fairness may be warranted.
To address that concern, the government could say, yes, more procedural fairness is needed. That's the model used in Quebec and other parts of the world. You can't have the same process in a regime that includes fines and orders. The way the system works now, an investigation takes place and it culminates in recommendations. The level of procedural fairness isn't the same as that provided for in the bill.
Furthermore, the bill gives my office a new tool, the ability to conduct an inquiry. This tool ensures procedural fairness and gives the parties an opportunity to be heard. It's something that exists in Quebec, British Columbia, Europe, Great Britain and France. The idea is that the commissioner can conduct a somewhat more informal investigation at first, but once an order or a fine is issued, it becomes more formal and it moves up to the next level. That's where the procedural fairness comes in.
To my mind, following that model and allowing decisions to be reviewed directly by the Federal Court of Appeal wouldn't be an issue. The Supreme Court has recognized that an administrative decision-maker can have multiple roles. Obviously, it has to be managed properly.
That's my answer. I think the issue is the concentration of responsibilities or authority in one place.
:
Thank you, Mr. Chair, and thank you, Privacy Commissioner.
I know there have been a lot of comments today about what the minister said and what's forthcoming. I want to make this very clear to those listening at home: This bill is very important because, for the first time in 20-some years, we're dealing with the largest amount of data that individuals, including our children, have ever had out in the open. We're dealing with data and, of course, in the second section, with AI. It's not up to the minister to approve certain amendments or decide what he wants to give us. It's up to this committee and then the House of Commons to determine how this bill, if adequate, will go forth to protect Canadians. I want to make that very clear.
We feel that, as the bill is presented right now, this government has not taken privacy seriously. It has not listed privacy as a fundamental right in the “purpose” statement of this bill, something other jurisdictions in this country, like Quebec, already do.
I want to speak today on a certain portion of this bill that already gives more power to business than it does to individuals. It's a section that I think you identified, called “legitimate interest”.
Commissioner, I'd like you to define “legitimate interest” in your own words for the public and for people listening. I know you have a legal background. It's your fourth recommendation. I want you to explain how this drafted bill continues to allow the government to make exceptions to the law by way of regulations, without the need to demonstrate that those exceptions are necessary.
:
The bill provides for some exceptions to the ordinary obligation to have consent and knowledge. It provides exceptions in situations linked to and necessary for business operations. There is a carve-out for activities that should not be for the purpose of influencing individuals. The bill recognizes there may be some instances in which businesses would need the information and it's not practical to obtain consent or advise individuals of it. The condition for this is that a reasonable person would expect the collection or use for such an activity.
I have a concern where the bill provides a list of activities that could be considered business activities in the act. Some of them are.... The first one is “an activity that is necessary to provide a product or service that the individual has requested from the organization”. There is that element of necessity. The second example is “an activity that is necessary for the organization's information, system or network security”. Again, necessity is there. The third is “an activity that is necessary for the safety of a product” or for the organization. This element of necessity is crucial, because that's what justifies the fact that you're not going to get consent.
However, the fourth—this is at paragraph 18(2)(d)—says, “any other prescribed activity.” It means that the government can add anything in there without a requirement of necessity.
My recommendation is that this be limited by saying, “any other prescribed necessary activity”, or by making it clear that the government is always limited by that necessity test.
:
Thank you to Mr. Dufresne and his team for being here. I appreciate your testimony today. Earlier, when I was on PROC and had the chance to work with you and your office, I always found you to be very helpful and very good at communicating. I thank you for that. I appreciate your expertise. You bring a lot to this conversation that's very important.
We want to strengthen this bill and to continue to see it get stronger throughout, hopefully, what will be a collaborative working relationship for all of us.
Certainly, one of the top concerns I have is the rights of children. I know that you've spoken to this and written about this. Many children today, as we know, are immersed in the digital world. I can speak from experience. My 11-year-old daughter hides devices and is on many apps and downloading things. There are pop-ups, and she sometimes purchases things. There's data being gathered about her preferences, and this really concerns me. I think it concerns a lot of Canadians with regard to children's data and protecting their right to privacy.
Are there currently laws in Canada that help protect children's privacy online? That's the first question, and I'll have another one about this in just a second.
:
On the current privacy legislation, we've provided guidance in terms of how to obtain meaningful consent. In that, we talk about some things that should be looked at in the context of children.
My federal, provincial and territorial colleagues and I have recently issued a statement, a resolution, on protecting young persons' privacy. It gives examples of things that should and should not be done with children's data—for instance, they should not be nudged toward bad privacy decisions—and it recognizes that young people are more vulnerable.
We can interpret the law, to some extent, to protect children, but we need to do more, and this bill has started to do so. There is now a recognition in the initial version of the bill, and I give credit to the minister on that. This was in the original Bill C‑27 as tabled—the recognition that the information of minors would be deemed to be sensitive information. That has impacts in a number of areas, in terms of disposal rights and so on.
We took that, and in our recommendations, we recommended going even further to highlight the best interests of the child in the preamble of the bill, so that if there is doubt in terms of interpretation, you can look at that. The minister has signalled his agreement with that and has suggested going further to include the special situation of children in proposed section 12 on interpreting appropriate purposes. That is a further improvement that I would certainly support. We see comments like that in the European context with recital 38 of the GDPR, highlighting that children deserve special protection. UNICEF has said that.
We know that our kids are digital citizens. They're spending time online for all aspects of their lives, including school. We certainly saw it more during the pandemic. It's important that the legislation protects them appropriately and protects them as children. We need to protect the best interests of the child. We need children to be able to be children in that world, to be protected, and not to suffer consequences later on, when they're adults, for things they have done online. There are improvements there, and I certainly support them.
:
The sensitivity of information is an element that's going to be considered in the bill, and I have to consider it in how I conduct myself. The sensitivity of information impacts the form of consent, the retention periods and the security safeguards. Certainly, it's something to be considered, and it should be relevant.
The proposal by the minister, which I support, to add the special situation of children to the appropriate purposes clause, proposed section 12, is important. I would add to that my recommendation today: to make sure that if an organization breaches proposed section 12, the appropriate purposes provision—including by treating children's information inappropriately—fines would be available.
Beyond that, with respect to children, I'm not suggesting more than what has been suggested in the recent proposals by the minister.
:
Bill C‑27 needs to prevent that at least to the same extent as the current law does. As you know, this was a matter that my office investigated. We made findings that Tim Hortons had breached privacy law by collecting more information than it needed and by not being transparent about what was being done with that information.
We see these situations, and we've made some recommendations and some findings. Bill C‑27 will help more than the current law, because it will provide for more explicit obligations in terms of explaining consent—making consent something that is explained in a meaningful way for individuals to understand. There is also the possibility that my office can issue orders, and there is the possibility of fines.
I believe that, in the Tim Hortons situation, the organization followed the recommendations. In the Home Depot decision that I issued last year, finding a breach of privacy, the organization agreed with the recommendation. However, that's not always going to be the case, so there need to be these enforcement tools—hopefully not to use them but to reach those results faster and in a proactive way.
:
Yes, absolutely. The use of sandboxes is good practice. Our British counterparts are very far along on that front. In the case of AI technologies, the industry gets to test out the data and methods in a secure environment.
It's clear that our office would have to be resourced to set up a sandbox. The bill doesn't go as far as establishing a sandbox, but it does require my office to provide the industry with advice as needed. That will be especially important for small and medium-sized businesses. Again, though, it will require capacity. The bill also calls on the commissioner's office to approve codes of practice and certification programs.
Those are all proactive and preventative measures. My recommendations on PIAs and privacy management programs are also prevention-oriented. That's the approach. Organizations have to do these things in the beginning and invest the necessary resources.
The OECD surveyed business leaders and legal experts to find out what challenges they were facing—challenges related not so much to AI but, rather, to international trade. They said it was sometimes hard to know where to allocate resources because certain investments didn't yield any clear legal benefit, or it was unclear whether they would.
Even if well-intentioned business leaders want to set up a sandbox, convincing shareholders to fund it is a challenge. Imposing a legal requirement on companies is helpful, because it sends the message that not only is it the right thing to do, but it's also required of them under the law. The same applies to PIAs.
By the way, I'm quite fond of the certification program provisions in Bill C‑27. Europe has that mechanism, and what it basically does is encourage companies to develop the programs and seek the commissioner's approval. Doing this and following the process will help them when complaints arise, because it shows that they acted in good faith and were proactive. It could even lessen fines.
All of those measures encourage companies to move in the right direction. Incentives are extremely important. To encourage innovation and ensure that Canada is well positioned, we have to act on two fronts: impose fines in problematic cases, and reward and recognize good behaviour. They go hand in hand.
My office's mission is to promote and protect privacy rights, and I really appreciate that. It's about more than telling people they did something wrong after the fact. It's also about working alongside them to make sure things are done right from the start.
:
We have a lot of discussions, in particular with the FTC in the U.S., which has jurisdiction for antitrust law. It is the equivalent of our Competition Bureau. Also, through that, it deals with privacy.
There's no national privacy legislation in the U.S. at the moment. There are proposals before Congress on this, but they're not moving forward. California has its own model, and they have some innovative mechanisms there to protect privacy. Nationally, there is no equivalent in the U.S.
We are in close discussions with those colleagues about AI. In fact, when I was in Japan last June, we issued a statement on generative AI. This was from all of the G7 commissioners for privacy, and for the U.S., that was the FTC. In that, we noted a few things. We noted that there are laws that apply to AI for privacy, and they need to be applied and they need to be respected. It also highlighted that we need to have privacy impact assessments. We need to have a culture of privacy when we're dealing with generative AI, because in many cases it is built on personal information.
There are a lot of exchanges that are going on in that space. I think the consensus is making sure that our citizens are aware that, yes, AI is moving at a fast pace, but we have privacy laws to protect citizens.
:
There's nothing moving in the United States Congress right now. They can't agree on a lunch, let alone a Speaker, so there's not going to be a lot of movement there.
I guess what I worry about, though, in terms of the larger, broader picture, is the corporate influence on the United States' legislatures with some of the lobbying that can take place. We have a containment factor to a certain degree here, aside from persons being able to get some supports later on, but it's nothing near what the United States has. We just want to keep that in mind as we go forward with the United States.
You mentioned Japan. Very quickly, before I lose my time here, what about Europe? What are your connections there, and what is happening?
:
Yes, we're working very closely with Europe, with the G7 and the community worldwide.
I was at a meeting of the Global Privacy Assembly just last week, talking about ethical uses of AI and highlighting the fact that we need proactive, strong privacy protections.
Going back to the collaboration with the G7 again, working very closely together, we issued a statement on AI. I was pleased to see the statement cited in the voluntary code issued by the Department of Industry to deal with AI, reminding organizations that we currently have laws that apply and that have to be respected.
However, that's why I am highlighting that, in this new bill, we absolutely need to make sure that proactive, protective privacy impact assessments are there, and that they are a legal obligation. Right now, in the public sector, there's no obligation for privacy impact assessments; it's in a Treasury Board policy. Often we'll see that if those impact assessments are done, they'll be done later. Therefore, it's important to have that legal obligation.
I can tell you that the international community is very much focused on AI, as are the G7 ministers. As you said, the debates are going on in the U.S., but certainly what is being highlighted and noted is that you cannot separate privacy from AI. To protect AI, to deal with AI, to have guardrails, you need strong privacy protections.
Thank you to our witnesses, as well.
Mr. Dufresne, the Liberals' new privacy bill was introduced 18 months ago, but it sat on the shelf for a year before they brought it to the House for debate. If I understood correctly, you didn't have an opportunity to comment on the bill in your capacity as privacy commissioner, since you took office when the bill was introduced. Your office did, nevertheless, have a chance to provide feedback on the bill.
Today, you told us that you made 15 recommendations. The minister proposed eight amendments, which you are no more privy to than we are. Here we are, asking you questions about a version of the bill that is by no means the last.
As the committee's proceedings continue over the next few months, do you expect to be able to give us your view on future iterations of the bill containing the Liberals' amendments? Knowing your opinion of the bill would help us in our assessment, especially since this is clearly not the final version.
On top of that, what do you make of the fact that the minister acted on only half of your recommendations? He accepted only four or five of the 15 you made. Why do you think he didn't accept more of them?
:
What I can tell you is that I saw the proposals in the letter, but at the end of the day, I certainly have to see the actual text of the amendments to know where things stand.
Nevertheless, I can give you my views on what isn't in the letter.
The minister mentioned four issues on which he was prepared to move forward. I talked about the privacy right, which, as it's been described thus far, certainly seems to be more in line with my recommendation. That's also true for the protection of children's privacy, but we'll have to see the actual amendment. I have less information about the compliance agreements, so I can't really say everything is satisfactory, since I haven't seen all the details. Lastly, he addressed my recommendation on co‑operation between regulators.
Earlier, we talked about the Federal Trade Commission, or FTC, in the U.S. Under the bill, my office can work with the FTC on joint investigations—as we've done in the past—but we can't do the same with the Competition Bureau of Canada. It seems counterintuitive to me that I can co‑operate more with other countries than with my own. That's something that will have to be resolved given the growing overlap between privacy rights, human rights, competition rights and copyright. We see that in the AI world, but elsewhere as well. Working together can be advantageous for everyone, Canadians and industry.
I did put forward eight other recommendations that the minister did not say he agreed with. I want to list them right now, since there's nothing on the table about—
:
That's great. Thank you, Chair, and thank you, Commissioner.
Canada's privacy legislation is 20 years old. During that time, we had Facebook come about. We had the iPhone released. We had social media become prevalent. It's actually very alarming that we haven't updated it in 20 years.
I come from the generation that was very young when, for example, Facebook was released. I think individuals—I'm not in that category, by the way; I was very careful—posted information that they perhaps did not want to, especially looking at it in hindsight.
We know that in the new legislation, there's an expansion of the personal information that individuals can request be disposed of. What do current laws cover, and how does the bill strengthen the ability of Canadians to have their personal information disposed of?
:
There are obligations for organizations to proactively prepare privacy management programs and to share some information about them. If it's information online, the idea is that you would see it online, but there are obligations for organizations to make it useful.... Obviously, if individuals see information about themselves and they have challenges finding where it is, they can reach out to my office and we can assist in seeing what's going on there.
That touches upon a point of transparency and making sure Canadians can understand what's going on, because not everyone is an expert in technology, yet we are living lives that are very much digital. Understanding what's going on, certainly with respect to AI, the notion of algorithmic decision-makers and....
We hear a lot of comments about that. We have our surveys of Canadians, and we see that Canadians are concerned about the protection of their privacy. I think part of the solution to that is communication, making sure that Canadians can understand what's going on, what the institutions are that protect them, what their rights are and what is being done with their information. Sometimes, we can have an impression that's worse than reality.
That's why I'm recommending that there be strong transparency. In the bill right now, organizations that make AI decisions about Canadians—if they have a significant impact—have to proactively explain the general processes of those decisions. They also have to answer questions if there's a request, but that's only if that decision has a significant impact on Canadians.
My recommendation is that if a Canadian asks for an explanation, they should receive the explanation, even if it doesn't have a significant impact on them. It still has an impact on them. They want to know what's going on. I think it's beneficial for Canadians to understand what's being done with their data.
:
That's great. Thank you.
My colleague, Mr. Turnbull, asked about the example of Tim Hortons, and how the app was tracking the data.
The example I want to reference happens all the time. I am in a room, and there might be a device in the room, and I am talking about a particular product with my friend. I'll go into the other room, where my computer is, and I'll go online and see an ad for that very product. It could be a very particular product at that time, so you wouldn't normally expect to see that ad appear.
What is this bill doing to protect Canadians from that? What penalties can companies face if they don't comply with these new laws?
:
That raises the whole issue of consenting to being tracked, and privacy by default. What are organizations doing, and what are they telling you those default provisions are? In my view, the default provisions should be privacy protective, certainly if you're dealing with minors. It's about making sure you have these protections in place, that you understand what they are, and that you can ask.... Again, it's that transparency. You gave a perfect example. You see this, and you want to know why you are receiving this. It's being able to get that explanation and to understand whether you have consented to this and how your consent is interpreted.
This is what we've seen in some of the investigations that my office has done, whether it's Tim Hortons, Home Depot or, more recently, Canada Post. It is the sense that Canadians don't know what's being done with their information, and sometimes there is a disconnect between what organizations believe Canadians agree to and understand, and what actually is going on.
That transparency, that explanation.... Again, it's the privacy impact assessments and consulting my office to make sure you develop these reflexes. Privacy is a priority. It's not something you do after the fact; you think about it as you're designing. You need to think about innovation, absolutely, but they're not mutually exclusive.
When I was appointed, I said that privacy is a fundamental right, but it's also not an obstacle to innovation. It's not an obstacle to the public interest. We can have both. Canadians deserve to have both, and tools like this will help organizations get it right. My office will be there to help organizations get it right, particularly small and medium enterprises.
:
Thank you, Mr. Chair, and thank you to the witnesses.
The last time I spoke to you, I brought forward a motion to get some of the answers as to why the government presented a broken bill. I think we got some results, and we're seeing some productivity now: Some of that information has been provided, and I can see that it's been very useful to you as well, Mr. Dufresne, so that's positive.
Some of my colleagues already touched upon the changes the minister has mentioned regarding section 12 of the proposed act. I want to dig into that really quickly.
However, first off, as I understand it now, when the current form of the bill, which is broken, is amended, we're going to see a children's right to privacy defined in the bill. Is that correct?
:
It's crucial to ensure that the regime can be met by small and medium enterprises, absolutely. The bill provides for a role for my office in terms of guidance. It provides the ability to join in certification programs and codes of practice. It's something that has to be taken into consideration. It's certainly something that I'm very mindful of.
To the point on competition and privacy, this is an example in which you can have overlap between competition and privacy. We need to make sure that protecting privacy doesn't harm competition and vice versa. We've made recommendations to Parliament and to the department on competition law review, to make sure you are dealing with what we call “dark patterns”, which are manipulative uses of language and psychological tools to incite individuals to make wrong choices, either from a privacy or competition standpoint.
This is why, in the last few months, my colleagues, the competition commissioner and the CRTC chair, and I created a digital regulators forum. We are working together to identify these areas of connection and interoperability. There are similar groups internationally. Our first focus right now, in our first year, is AI and making sure we are on top of those new developments.
This is why my 15th recommendation is to expand the scope of my office's ability to collaborate with regulators like these, in particular in the context of complaints. Right now I can't do that with my Canadian colleagues, but I can do it with my international colleagues.
:
We need to make sure that the law will hold up despite the rapid evolution of technology, if not with it. There's a lot of talk about generative artificial intelligence right now. A year from now, it'll be even more powerful. Who knows? So the law has to be able to adapt. That's why the bill contains principles and doesn't talk specifically about generative artificial intelligence, for example, but rather about automated decisions. The definitions need to encompass all this and there needs to be flexibility for the government to set regulations and for my office to set guidelines so we can adapt to new technologies.
The recommendation we're making on privacy impact assessments is very important in this regard. Every time we develop something, we have to document it, assess the risks and carry out consultations, precisely to stay ahead of these technologies. This is one of my priorities, along with protecting children's privacy. We have to keep up with the evolution of technology. This measure makes it possible.
Another of our recommendations concerns de‑identified information. De‑identified information is defined a little too broadly, in my opinion, particularly in French. This definition must be very strict, because it limits legal obligations. In these definitions, we must also take into account the risk of “re‑identification.” The bill says that more can be done with de‑identified information, and that if it's anonymized, the law doesn't apply at all. So there's a big responsibility that comes with that. These definitions need to be strict.
On the issue of de‑identified information, I recommended that we take into account the risk of “re‑identification,” because technology evolves. If a piece of information is de‑identified today, but in two or three years' time technology makes it possible to know once again who it's linked to, we'll be right back where we started. This has to be able to evolve over time.
:
That would be the model, in fact, that exists in Quebec and that exists internationally. In fact, the GDPR—which, as you know, is the regulation that applies to the European Union—states that the DPAs, which would be the privacy commissioners, have the ability to issue fines. In the recital, in the description of this, they're talking about the DPAs issuing the fines, and they're generally reviewed by the courts. They list Estonia and Denmark as being exceptions, where they have to ask courts to issue fines because of the specifics of their legal structure.
The CAI, my counterpart in Quebec, has the ability to issue fines. They are reviewable by the normal court system. If there were no tribunal, this could work in the same way. Bill C‑27, as drafted, already creates a more formal process for my decisions. It provides that the investigations happen at the front end. You try to resolve matters. If you don't resolve the matter, then it goes to what is called an inquiry, and I will have obligations under the law relating to codes of practice and consultation with industry. Procedural fairness has to be an element of that, and at the end of the day, those decisions, if you choose as a Parliament to give the authority to my office to issue fines directly, would be reviewable by the Federal Court through the normal judicial review process. That's certainly an option.
On the other option, if the decision is to create a new tribunal, my recommendation is that if we're adding a layer of review, we should remove one, so it should go straight to the Court of Appeal, otherwise there will be a cost.
:
Thank you, Mr. Chair, and thank you, Commissioner.
We had a discussion earlier about the preamble having no legal binding force—it's a statement of intent and, once the bill is passed, doesn't appear in the statute in Canada. The purpose section, then, is very important. You just said in response to other questions that if you put “fundamental right” in along with the word “and”, the “and” is there balancing against it, but that's okay. It still makes “fundamental right” prominent.
I'll go in a different direction on that. Let's assume that's correct. The Liberals are introducing into this bill a concept in privacy protection, the concept of a legitimate interest—the legitimate interest of the business, the big business, to use one's data in a way in which it doesn't have permission to use it, and to allow it to do so even if it causes harm to the individual.
I would argue that proposed section 18, which introduces this concept, actually does not make the privacy of the individual of paramount importance. Proposed section 18 actually makes business interests more important, because a large business can ignore whether or not you gave it permission. It can ignore whether or not the information being used is going to harm you for its own legitimate interests, which are not always aligned with those of an individual.
Would you not agree that having that “and” gives that power in proposed section 18 much more weight, enabling them to ignore whether or not it's a fundamental right?
:
That takes away the power from Parliament and leaves the judgment in the hands of bureaucrats as to what the list is, because that's where regulations get made.
I would also argue that the Liberals are further watering down this issue. When you look at the terms of express consent in proposed section 15.... For those watching, express consent means I have to give you permission to use my data. The Liberals have designed a number of escape clauses from express consent that allow businesses to get around it. Those escape clauses in proposed section 18 allow them to get around it.
Also, in proposed subsection 12(4), in the purposes of the bill, it reads that where a business needs to use a person's information for a new use, it doesn't have to get the person's permission. It just has to record it somewhere. There's no need for consent from the person if the business uses it for a new use.
As we know, this is evolving rapidly. I got somebody's consent five years ago. I decide to use their stuff in a different way. I just have to record it somewhere now. I don't even have to tell the person I'm using it. It's a further watering down of the person's protection as a fundamental right, giving much more power.... When you combine it with proposed sections 18 and 15, the exceptions, and with proposed subsection 12(4), that's giving enormous power to a business to do whatever they want with that individual's data without their permission.
There's some really great conversation today. I really appreciate all the questions and the positive engagement here. I think this is really good for this work and this legislation.
I want to get back to a line of questioning that I started on and didn't quite finish.
I sort of take it that Bill C‑27 and the minister's letter, which provides details, introduce new obligations for organizations and companies. It's also giving your office and you new powers. I think both are positive.
One of the questions that keeps coming up for me when considering what work you'll have ahead, once we hopefully get this bill through and strengthened in many ways, is whether there is enough around detecting non-compliance. It seems to me that it must be hard to detect who is not complying with the additional obligations being introduced in Bill C‑27.
Can you speak to how you'll undertake that? I know you mentioned it with the last question about additional resources needed. I'm certain that's part of it, but could you speak to how you'll detect non-compliance when it does occur?
Certainly, resources are part of it, because there are a number of things here, whether they are audits, certification programs, guidance or communication, to make sure Canadians can flag things for us. There's also the technological aspect. We have a technical lab at the OPC, where we are trying to stay ahead of the evolving technology, and getting those resources, that expertise and that understanding will be important. Using all those tools, including compliance audits and requests for information, there are obligations in there under which I can ask to see certain information from organizations, or their privacy management programs.
Certainly we'll need something put in place so that we are not in a reactive mode but are aware of what's going on. I have to say, we have good engagements with representatives in industry and academia. I think this will continue, both in Canada and internationally, to make sure that we are hearing what the trends and concerns are and can act on them.
:
I'm not trying to give you ammunition to ask for even more funds, but I suspect that there's a lot of online activity and a lot of data being collected, and that it would be very challenging to try to monitor and detect any breaches of the obligations that would be in Bill C‑27. With respect, I think you have your work cut out for you in the future. I don't envy you that, but I appreciate the work that will be undertaken.
Maybe I'll leave it there for the moment on that.
I have another question or two. On the flip side of this—and I think my colleague Mr. Van Bynen asked some questions about this—what are the risks in going too far? By “going too far”, I mean are there risks within this legislation and this debate we're having, such that we could go a step or two too far and impede all of the positive benefits Canadians are getting out of the use of these online tools?
The data that's collected has enhanced our lives in a lot of different ways. There's a sense in which there's that balancing act between innovation and privacy, which you've already talked about. I guess I want to know specifically whether you see any risks in going too far. We really have been talking from the other side, about not going far enough on privacy rights. If we go too far, we might also stifle innovation. Would you agree with that, and are there any risks to that?
:
We have to strike the balance, but we have to remember that we are dealing with a fundamental right, so we need to start with that premise. We need to make sure we're protecting the fundamental right to privacy, because it's core to who we are as a society and as individuals.
Absolutely, though, we need to do it in a way that supports innovation. We need to do it in a way that puts Canada in a competitive situation and that allows Canada to work and trade on the world stage.
The good news, in terms of protecting privacy, is that it actually gives us economic advantages in many ways, certainly in terms of Europe and being recognized by that system as providing adequate levels of privacy protection. That's not just good for privacy; it's good for trade, because it allows our companies to trade with Europe in a better way.
There are benefits there, but absolutely we need to make sure, and you need to hear from industry. I've heard from industry. I have good dialogues with them. They may not always take the position I do, and that's okay. However, I can tell you that we have regular discussions and exchanges. They will be coming in front of you, and they have a valid perspective to bring.
I was sort of leading up to this, which is interesting. With very much respect, I'll say.... As the Privacy Commissioner, advocating for the fundamental right of privacy, it seems to me that you might naturally be inclined to support one side of this debate. I can hear that you definitely appreciate the side of innovation and industry, which, in a way, is the minister's responsibility.
There also may be a flip side to this. Those industry stakeholders would express their position with regard to where we could go too far and limit the benefits that are also very important for this work. I wanted to put that out there.
Respectfully, I hope we can continue to have a very collaborative working relationship as we move forward. I would expect nothing less, because that's been the history so far in our working relationship.
Thank you very much for being here today. With great respect, I really appreciate your testimony.