The Chair:
I call the meeting to order.
[Translation]
Welcome to meeting No. 100 of the House of Commons Standing Committee on Access to Information, Privacy and Ethics. Pursuant to Standing Order 108(3)(h) and the motion adopted by the committee on Wednesday, December 6, 2023, the committee is commencing today its study of the federal government's use of technological tools capable of extracting personal data from mobile devices and computers.
[English]
Today's meeting is taking place in a hybrid format, pursuant to the Standing Orders. Members are attending in person in the room and remotely using the Zoom application.
I just want to remind all members again not to put their earpieces next to the microphones, as that causes feedback and could injure our interpreters.
I'd now like to welcome our witnesses today. From the Office of the Privacy Commissioner of Canada, we have Mr. Philippe Dufresne, the Privacy Commissioner of Canada. Welcome, sir. We also have Lara Ives, executive director of the policy, research and parliamentary affairs directorate.
Before Mr. Dufresne begins, he has asked for up to 10 minutes to address the committee. I've granted that.
The other thing I will remind members of is that since we have only these two witnesses for the next two hours, we will reset the clock at the top of the hour and give Mr. Villemure and Mr. Green the additional time that they need.
Mr. Dufresne, again, welcome, sir. It's good to have you at the committee, as always.
Please commence with your opening remarks.
Mr. Philippe Dufresne:
Thank you, Mr. Chair and members of the committee, for the invitation to contribute to your study on the federal government's use of technological tools capable of extracting personal data from mobile devices and computers.
Last fall, CBC/Radio-Canada reported that 13 federal institutions had acquired such tools. The media reports raised questions about the reasons for their use and whether these organizations were respecting their privacy obligations in using the tools.
Initial reports referred to them as covert surveillance or spyware. Since then, it has been clarified that the tools are digital forensic tools, which are distinct from spyware. Digital forensic tools are used to extract and examine large numbers of files from laptops, hard drives or mobile devices. They are typically used in investigations or technical analysis, and often with the knowledge of the device owner.
[Translation]
They can be used to analyze the metadata of a file, or to create a timeline of events, such as when an account was used, when websites were accessed, or to see when an operating system was changed. These tools can also be used to recover deleted data or to ensure that data has been properly wiped from a device before it is discarded or repurposed. This makes them useful investigative tools that can help to preserve the integrity of an evidence chain.
Digital forensic tools are distinct from spyware in that spyware is typically installed remotely on a person's device without their knowledge. It can then covertly collect personal information, such as keystrokes and web-browsing history. One example would be on-device investigative tools, or ODITs, which are used by law enforcement to obtain data covertly and remotely from targeted devices. Importantly, in the context of law enforcement, judicial authorization is required prior to their use.
[English]
In August 2022, I testified before this committee as part of your study about the use of ODITs by the RCMP. You will recall that in that case, the RCMP advised the House that it had been using ODITs in recent years to obtain data covertly and remotely from targeted devices, but had not completed a privacy impact assessment, or PIA, and had not advised my office.
In my appearance at the time, I noted that PIAs were required under Treasury Board policy, but were not a legally binding requirement under privacy legislation. I recommended that the preparation of PIAs should be made a legal obligation for the government under the Privacy Act.
[Translation]
In its November 2022 report, the committee endorsed this recommendation and also called for an amendment to the preamble of the Privacy Act to indicate that privacy is a fundamental right, and for the act to be amended to include the concept of privacy by design and explicit transparency obligations for government institutions. I welcomed and supported these recommendations, and the committee may wish to reiterate them as they remain outstanding and relevant.
[English]
With technology increasingly changing the manner in which personal information is collected, used and disclosed, it continues to be important that government institutions carefully consider and assess the privacy implications of their activities to determine if and when PIAs are required.
My vision for privacy is one where privacy is treated as a fundamental right, where privacy supports the public interest and innovation, and where Canadians trust that their institutions are protecting their personal information. Conducting a PIA and consulting my office before a privacy-impactful new technology is used would strengthen privacy, support the public interest and generate trust. This is why it should be a legal obligation for government institutions under the Privacy Act.
[Translation]
Currently, the Treasury Board Secretariat's directive on privacy impact assessment requires that institutions conduct PIAs when personal information may be used as part of a decision‑making process that directly affects an individual; when there are major changes to existing programs or activities where personal information may be used for an administrative purpose; when there are major changes to existing programs or activities as a result of contracting out or transferring programs or activities to another level of government or to the private sector; and when new or substantially modified programs or activities will have an impact on overall privacy, even where no decisions are made about individuals.
[English]
In our advisory discussions with federal institutions, we promote the use of PIAs as an effective risk management process. PIAs ensure that potential privacy risks are identified and mitigated, ideally at the front end, across programs and services that collect and use personal information. That said, the use of a new tool does not always trigger the need for a PIA. This will depend on how the tool is being used and what is being done with the information that it collects.
The OPC has used digital forensic tools, for instance, in the context of certain breach investigations to determine the nature, scale and scope of the incident, including how a breach occurred and what types of personal information, if any, may have been compromised.
[Translation]
Digital forensic tools, however, can be used in ways that do raise important risks for privacy that would merit a full privacy impact assessment.
For example, they may be used in an internal investigation of an employee's conduct, where a decision will be made that directly affects that individual, or as part of an inquiry into alleged criminal activity.
In those types of cases, a privacy impact assessment would be required—addressing not only the specific tool being used to collect personal information, but the broader program under which the tool is being used.
[English]
It is incumbent on all federal institutions to review their programs and activities accordingly. Where digital forensic tools are used in the context of employee monitoring, institutions must take steps to ensure respect for the fundamental right to privacy and foster transparency and trust in the workplace. There should be clear rules about when and how monitoring technologies are to be used. My office updated its guidance on privacy in the workplace in May 2023, and my provincial and territorial colleagues and I issued a joint resolution on employee privacy in October 2023.
In the present case, following the CBC/Radio-Canada reports regarding the use of digital forensic tools in the federal government, my office followed up with the institutions listed in those reports and in this committee's motion to proceed with this study.
[Translation]
To summarize what we learned, three organizations indicated that they had completed and submitted a privacy impact assessment—or PIA—on the relevant program; one organization indicated that it had procured the tool but never used it; another organization indicated that a PIA was not required; and the remaining eight organizations indicated that they had either started work on a new PIA, or were considering whether to conduct a new PIA or to update an existing one in light of their use of the tools.
[English]
We will continue to follow up with institutions to insist that PIAs be completed in cases where they are required under the Treasury Board policy, but without a requirement in the Privacy Act there are limits to what we can do to ensure compliance. Privacy impact assessments, in appropriate cases, are good for privacy and good for the public interest, and they generate trust. In this increasingly digital world, they should be a requirement under privacy law.
I'd be happy to take your questions.
Mr. Philippe Dufresne:
I don't have all the details of what they would be doing. That could be asked of them.
Generally speaking, you would be talking about the tools that the employer provides to the employee: the email, the laptop and these types of things. Nonetheless, there are some expectations of privacy vis-à-vis these tools, but it's contextual. Employers have legitimate reasons for obtaining certain types of information. We talk about that in our guidance and really highlight it: Make sure you've assessed the tool. Make sure you've assessed its necessity and proportionality. Make sure you are transparent about it and that people know.
In our annual report last year, we talked about one of our investigations in the private sector, where a trucking company was using a monitoring device on truck drivers. Even when they were not on duty, they were being filmed and recorded 24-7. We found that this was too broad. It was legitimate to do it while they were driving, for safety reasons, but it had to be limited to that. That change was made.
This is the type of questioning that goes on with regard to a privacy impact assessment. When my office is consulted, especially before a program is initiated, we can raise these types of questions. Let's prevent these things. Let's prevent Canadians from worrying, so that they can feel, “Okay, this is a tool and here's what it does. The Privacy Commissioner's office was consulted and provided input.”
That's what I'd like to see more of, especially in situations where we often learn after the fact that something was being used.
Mr. Philippe Dufresne:
The policy is an internal rule that the government imposes on itself; it's a directive issued, in this case, by the Treasury Board, which says, here are the expectations we have of departments. It's certainly important, but it doesn't have the same binding legal force, and it certainly doesn't allow me to conduct an investigation in the same way as if it were in the Privacy Act. That's why I'd recommend, as the office has recommended, making it a legal obligation. I've recommended this for the private sector as well, especially vis-à-vis AI. I compare this to pre-flight checks on airplanes: it's something that will bring comfort and reassurance when we're using powerful tools.
In instances like this, we've reached out to the departments. We have regular consultations with departments, and we have a government advisory team that's always on standby for consultations with them. Again, what we see sometimes is, “Okay, we will now do a PIA. We will now update it, and we have a program.” Sometimes we're told that this is authorized under their program's legal authorities, or that they are doing it under a warrant. We have to remind those departments that, even if you're doing it under a warrant or under a valid legal authority, the privacy impact assessment is a separate question. You may still need to do one if your legal use of that tool nonetheless impacts the privacy of Canadians.
It's an extra step, and if it were a legal obligation, my belief is that we would see more compliance up front, rather than situations like this, where people sometimes find out through important media reports. Again, it may well be that these tools are appropriate for their purposes. They're distinct from spyware. They're distinct from ODITs. Even ODITs may be acceptable in appropriate cases, but having that discipline, and having those PIAs seen to be done, builds the trust that lets Canadians say, “Okay, I don't have to look over my shoulder constantly. The institutions themselves have these tools and these reflexes.”
Mr. Philippe Dufresne:
I think there are all kinds of challenges, whether in terms of resources or the pressure on departments. They're in a better position to speak to that than I am.
The challenge is that privacy impact assessments are mandatory under the Treasury Board directive but not under the act. The directive makes distinctions, for example, between a new program and the update of a program, or between the assessment of a program and the assessment of the tool itself.
Given these distinctions, the department can say in good faith that it is of the opinion that an assessment isn't required, because the directive doesn't require it. And yet, perhaps it should be required. With technology becoming increasingly powerful, it could become even more important to reassure Canadians that we're doing all this in an even more proactive manner. So it would be preferable that it be a legal obligation.
Moreover, this is not an issue that concerns only Canada, obviously. My international colleagues, at the conference of the Global Privacy Assembly, adopted a resolution on artificial intelligence in the area of employment. It calls on governments and parliamentarians to be aware of the need to set guidelines. If artificial intelligence technologies are used to recruit workers and assess their performance, that can have an impact on privacy. So we have to be transparent and take into account the notions of necessity and proportionality. These are fundamental questions.
Mr. Philippe Dufresne:
In terms of the public sector, again, this notion of proportionality is not included in the Privacy Act. We recommended, as did this committee, that necessity and proportionality be included. At this point, it is more the Treasury Board directive that requires that a use be necessary to achieve the desired objective.
Currently, the act requires that the use be related to a mandate of the organization. For our part, at the Office of the Commissioner, we will implement that necessity and proportionality by raising questions about it in our investigations. We're talking about it now, just as we talked about it during the investigations into the measures taken during the pandemic, in particular. When we talk about this, though, we have to recognize at the outset that this is not a legal obligation and that, if it were not respected in a given situation, it wouldn't be a violation of the act.
This is a very important recommendation. The approach is very similar to how we proceed in the context of the Canadian Charter of Rights and Freedoms to determine whether there is discrimination or a violation of fundamental rights. We determine whether the objective sought is important, whether the proposed measure achieves the objective, whether the method used to achieve it is the least intrusive and, lastly, whether the method is proportional.
You're absolutely right: We may be tempted to use a tool because we find it very efficient and quick. Artificial intelligence comes to mind. Yes, it's effective, but we're talking about a fundamental right here.
Having said that, it's not an either‑or. Personally, I'm in favour of technology. In the office, we have made it one of our three strategic priorities recently. We want to use technology, but in a way that protects privacy. In that sense, the privacy impact assessment tools are essential. These assessments must not only be done, but also be seen to be done.
You know, I'm thinking back to the work that we conducted on the RCMP, and my hope is that, at the time, these departments were tuned in, knowing that they were actively engaged in similar activities. My disappointment is that it took them this long to come clean. There are 13 departments here. Certainly, there are many more federal departments, some of which may or may not have declared their use. I won't impugn what the other departments are doing.
I do note that in the language of the directive on privacy impact assessment, in paragraph 3.3, it states that the Privacy Act requires “assessing the privacy implications of new or substantially modified programs and activities involving personal information”. I believe you just referenced this, sir. Then the next line says, “However, if not properly framed within an institution's broader risk management framework, conducting a PIA can be a resource-intensive exercise.”
How resource-intensive is it?
Thank you, Commissioner, for being here. Thank you to your team for their work.
I just want to highlight something for those listening and watching. There are 13 government institutions that were called into question in the article that's been mentioned. It's Fisheries and Oceans Canada, Environment and Climate Change Canada, the CRTC, the CRA, Shared Services Canada, the Competition Bureau, Global Affairs, the Transportation Safety Board of Canada, Natural Resources Canada, Correctional Service Canada, the Canada Border Services Agency, National Defence and the RCMP.
I think, like many Canadians, that some of those are not surprising. I think it's disappointing that they did not conduct PIAs, as was referenced. The question around trust is certainly highlighted here.
Commissioner, do you believe that this information would have come to light had it not been reported? Is there a reporting mechanism within government that would have said, these tools are used and here's the number of times? Had it not been for this article that references this, would this information have come to light otherwise?
Mr. Larry Brock: I'd like to thank the witnesses for their attendance today. This is a very important and serious issue, not only for Canadians but also for the public service. I want to start by looking at some legal principles.
Every Canadian, and that includes every public service employee, has a right under section 8 of the Canadian Charter of Rights and Freedoms to be secure against unreasonable search and seizure. Although you've testified, sir, that in some cases legal authorization was obtained, you can't say for certain that authorization was obtained in all cases. That raises charter considerations. A breach of a section 8 right is a serious violation that, hearkening back to my years as a Crown prosecutor, quite often meant a prosecution was not met with success. Courts across this country have upheld the privacy rights of Canadians, with strict consequences for breaches.
That backdrop was important for me to frame this question. When I look at the 13 institutions that were identified, there's no guarantee that these are the only 13 institutions that have been using this technology. Is that a fair assessment, sir?
:
Yes...because these are the institutions that were identified by the CBC reporter. Is that correct?
Mr. Philippe Dufresne: Yes.
Mr. Larry Brock: When I take a look at this list, I might give some consideration to Fisheries and Oceans Canada, and perhaps the Competition Bureau, but when I look at the Canada Revenue Agency, Global Affairs, Correctional Service Canada, the Canada Border Services Agency, National Defence and, most importantly, the RCMP, they all have great legal teams working behind them, in many cases the Department of Justice, who would certainly instruct not only the management of those departments but also their employees about the protection of privacy rights. To learn, then, that in many instances judicial authorization was not obtained, that a PIA was not submitted for your consideration and that data was collected raises serious privacy concerns.
You mentioned earlier, sir, I think in your opening statement, that in many cases when this software was used on public sector employees, it was with the consent of the device owner, but you can't say that in all cases the extraction of data using this software was done with the consent of the device holder. Is that fair to say, sir?
Mr. Philippe Dufresne:
We talk about that in our “Privacy in the Workplace” document, which we revised in May 2023. It's really about monitoring and transparency. To your point, if you as an employee are aware of what the employer can and can't do, and of what the employer's tools can do if you use this device, then you have that awareness as the user. You have that transparency.
There may be circumstances where it's absolutely warranted for the employer to have access to certain things. However, even if the information is there on the phone, why would the employer need to have access to that health information of yours? You put it there, perhaps rightly, perhaps wrongly, but does the employer need to have that?
How do we balance limiting the use, limiting the collection and ensuring that transparency? We have to modernize these rules and apply them to evolving technology. It was much easier before because, as you say, on these devices so much of our lives gets mixed together.
We were talking about the RCMP's use of ODITs before, and that was really done because wiretaps weren't working anymore; people weren't using landlines. However, the landline didn't give you nearly as much information as the phone does. That's an example of a different tool, but one of a greater magnitude.
Mr. René Villemure:
Thank you very much, Mr. Chair.
The committee has requested a review of the Privacy Act on numerous occasions. Treasury Board announced a review of the act in 2021. As part of that review, I believe every Canadian is being consulted individually, which is time-consuming. The committee has already made various recommendations. Mr. Dufresne is with us this morning and I believe he has told us a million times that a review is needed. So we have to emphasize the need to discuss this again because it seems that the recommendation has not been taken seriously.
It's like anything else: having to repeat it eventually becomes untenable. We have seen the on-device tools. There have been other studies about privacy. Every time, a review of the act was recommended. All of this will ultimately lead to something.
I think the committee is in agreement, since we have heard the same testimony. The commissioner who is with us and his predecessor told us the same thing: there is cause for concern. AI will completely change the situation. The tools change, but the 1983 act remains in effect even though the rest of the world has changed.
Having made this recommendation in two reports already, we have to support the motion in the public interest. I want to stress that again.
The Chair:
I appreciate that, Mr. Erskine-Smith.
If this motion is adopted, it will go into the minutes of this committee and then, of course, as we deal with the draft report once all the witnesses are done, it can figure prominently in that report. In the meantime, this is what Mr. Villemure has proposed and what we're dealing with right now.
I appreciate your input on that, Mr. Erskine-Smith.
[Translation]
Do we have a consensus on Mr. Villemure's motion as amended?
Voices: Agreed.
(Motion as amended agreed to)
[English]
The Chair: Thank you again for your patience, Mr. Dufresne and Ms. Ives.
Monsieur Villemure, you have the floor for six minutes.
Go ahead, please.
Mr. Philippe Dufresne:
AI is a key element that affects the privacy of Canadians and people around the world. This summer, my G7 counterparts and I issued a resolution at our annual meeting reiterating the importance of protecting privacy. We reiterated the importance of implementing current legislation and the need to modernize that legislation and consider the effects of AI from the outset.
In October, my provincial and territorial counterparts and I also issued a statement. On December 7, we held a symposium here in Ottawa with our counterparts from other countries, and issued a statement about AI based on Canada's privacy principles. We also stated our expectations, specifically regarding legal authority, appropriate objectives, necessity and proportionality, accountability and limits of use. We applied that lens to AI.
With regard to the tools under discussion today, I have not been told that this involves AI, but that is a possibility we must certainly bear in mind. With regard to employment, the resolution issued by the Global Privacy Assembly last fall referred specifically to the use of AI in employment matters, including staff management and recruitment.
I cannot go into any details, but right now we are looking into a complaint against OpenAI to determine whether the company is in violation of the act with ChatGPT. We are also considering what to recommend if it is in violation.
There are also all the issues relating to the data that is used to train AI. What is protected and what are the limits? The Organization for Economic Cooperation and Development conducted a study on AI with G7 ministers, and the three greatest risks identified were disinformation or misinformation, the effects on copyright, and the effects on privacy. So this is a very important issue.
Last week, the Office of the Privacy Commissioner of Canada announced its three strategic priorities. The first is optimizing and modernizing the office's structure to ensure we have the maximum impact. The second is ensuring that technology respects privacy and that people can use it, but within guidelines. The third is protecting children's privacy, another extremely important element. Indeed, the CEOs of social media companies appeared before the U.S. Congress this week to talk about their platforms' impact on children. These priorities are at the heart of our work.
Mr. Matthew Green: I appreciate being able to go back to the point about the human context of surveillance. We certainly delved into that when we talked about AI and its use for on-device surveillance in the audit response. We covered, I think compellingly, the ways in which bias is baked in.
What I'm struck by here is surveillance that is more akin to CCTV. You know that in places like the U.K. and the United States, where there's expansive use of CCTV, it has been found to be really susceptible to abuse, just based on human nature. The American Civil Liberties Union identified four ways in which CCTV is susceptible to abuse, so for the purposes of this round, I want you to consider that context.
The first is criminal abuse. In instances like this, obviously, if the federal government is acting warrantless and outside the scope of its work, that could suggest criminality. The second is clandestine abuse, if we're talking about our RCMP, our national defence or, perhaps more specifically, our national security establishment and the way it does online surveillance. I'm not suggesting they're involved in that, but it is a possibility. The third is institutional abuse: the overreach, the top-down approach, the way in which government institutes surveillance on the public, is a significant risk. CCTV was found to be not only ineffective but also, it was argued, an institutional abuse.
I think what I'm most concerned about, given the sensitive nature of the information, is the abuse for personal purposes, which is why I was trying to drill down on exactly who has access. For anybody who's not familiar with IT, we have some idea of who's on the IT side.
Would you agree that the possibilities and susceptibilities for abuse at the CCTV or analog level ought also to be considered at the deeply digital level, particularly as it relates to AI and technologies with full and complete access to people's personal information and data? Is that a fair assumption?
Mr. Philippe Dufresne: I think the more powerful the technology is and the broader its scope, the more careful you have to be and the more privacy protections and considerations you need. That's what proportionality is. You have a more intrusive tool, so you need a more rigorous protection mechanism.
I agree with you. The human element is important. We're talking about privacy as a principle. It's a fundamental right, absolutely, but it means that, at the human level, we're all less free if we lose our privacy, if we're living a life where we feel that we're constantly under the microscope and that people can see what we're doing, where we're doing it, what we're buying....
I point to one of the earliest articles on privacy, called “The Right to Privacy”. Its authors gave the example, back in the 1800s, of someone who collected rocks, and said that privacy means you're allowed to do that without everyone in the village getting to know which rocks you're buying. That's your information.
Today, obviously, we can see that it's even more powerful. This is part of our freedom and our individuality, so we need to make sure that reflex.... It's not to say that you can't use technology—you can—but we have to do this bearing in mind privacy.
The Chair: The motion, as Mr. Green stated, had been sent to everyone's email prior to his moving it. The motion has been moved. I'm going to accept it because it relates to what we're studying today.
Is there any discussion on Mr. Green's motion? I don't see any.
(Motion agreed to)
The Chair: Thank you, Mr. Green. That concludes your time. The motion has been adopted.
We're going to go with five minutes, five minutes, two and a half minutes and two and a half minutes to conclude.
I know that everybody is aware that Mr. Barrett will be moving a motion at the end of this meeting. We should dispose of that fairly quickly.
We're going to commence our five-minute round with Mr. Kurek.
Go ahead, Mr. Kurek.
Mr. Damien Kurek:
Thank you very much, Chair.
I think that the passing of that motion highlights a concern around the unknowns that exist. When I saw that Environment and Climate Change....
I represent an area with lots of farmers. There have been concerns highlighted to me by farmers who get correspondence from different levels of government. They don't know what certain demands are, what they mean or what's included in that. They're asked to agree to things that they don't necessarily have all the details about.
The fact that there are unanswered questions highlights how important it is to really get to the bottom of this. If that is impacting the privacy rights of Canadians, certainly we need to be very clear on that.
There's also the fact that Environment and Climate Change Canada recently had a job posting where they were looking for climate enforcement officers. What does that look like? Farmers in my constituency are asking what that looks like. I certainly would ask those questions.
Commissioner, you've outlined some of the concerns. If I could, I would ask you to provide some specific examples of what needs to change in order for you not only to have the tools but also to ensure that the legislative framework is in place, so that the questions we have asked, and that all parties have asked, can in fact be answered. Those questions currently cannot be answered because of gaps, or because of regulatory frameworks in Canada that don't go far enough.
Could you outline some of those things?
Mr. Philippe Dufresne: I would recommend that a PIA be required when new, powerful tools can have an impact on the privacy of Canadians. Perhaps move away from the notion of the program itself: if there is a new tool that changes the context, then consider a privacy impact assessment.
I would recommend making it clear that my office has to be consulted and advised before the deployment of new technology and before new programs or changes to existing ones, not after the fact. Indeed, it should be not just on the day of, but with sufficient time for us to provide meaningful input.
I would recommend that the Privacy Act be modernized to include necessity and proportionality, which is this element and this discipline of saying, the goal may be important, but are we limiting the information that we're gathering to the minimum required?
Those would be highlights.
I echo the committee's recommendation on privacy by design. Also, of course, order-making powers for my office are important and should be included in new law as well.
:
Right, and it seems to me that, in the course of your investigation here and in your questions, you ought to be reaching out.... It would be better, and I appreciate Mr. Green's motion, but you're actually better placed to reach out to these other organizations, ask those very same questions and then revert back to us. We could oversee your work, because we have powers that you don't, but you have the time and the inclination to do the detailed work of asking the questions.
In the course of asking those questions, it would be good to know, one, how many times it's been used absent judicial or other authorization pursuant to existing due processes for investigations and, two, and this gets at my previous inquiry, whether there are instances where they're searching government devices pursuant to an internal investigation, such as a harassment investigation. That's another category where it makes a lot of sense to me that it would be used.
Now you get to the subsequent concerns around scope of search, and you will want to inquire as to scope of search. If there are concerns about scope of search, I would again ask you to revert back to us. It would be good to know if this is being used in other instances, any concerning instances, that don't involve investigations that on the face of it seem reasonable.
My last question.... You'll get back to us on the number of uses. On scope of search, as you look to privacy impact assessments and work with these agencies on them, it would probably be good.... Let's take the concern that Mr. Barrett raised about the difference between a government device and the cloud; it's a fair point. Now, your point back, rightly, is that one has a reasonable expectation of privacy, that one has different expectations of privacy in different material, and that one protects that reasonable expectation of privacy within the bounds of necessity and proportionality.
I would be very interested to know if departments, in the course of their investigations, have gone beyond the bounds of necessity and proportionality. Are they searching the cloud unnecessarily in the course of harassment investigations? Are they searching in health info? I mean, it's a theoretical concern of this committee. Did it actually happen?
If, based upon your investigation, you could come back to this committee with real concerns identified, it would be appreciated.
Mr. Philippe Dufresne:
In the strategic statement that we published last week and that I will forward to the committee, we talked about three priorities.
The first is modernizing the office and maximizing its impact through new legislation. At the very least, if there is no new legislation, we will have to examine how to protect privacy as much as possible.
The second is technology. We have to get ahead of technology or at least keep up with the pace of developments. That is a big challenge because we can see that people are increasingly adopting it. People like technology, use it and see its benefits. So we have to make sure that their privacy is considered and protected.
The third is protecting children's privacy. This is a very challenging area. It impacts their mental health, their reputation and their data. So there is work to be done in this regard and we will be focusing on these areas.
Internationally, there is also the issue of protecting data that flows across borders.
In short, by focusing on technology and trying to anticipate its trends and uses, and by protecting children and their privacy, we are focusing on the future. That is why we have chosen these areas. We are also open to recommendations the committee might have for us on other matters.
For my part, I take a broad view: We want to make the most of innovation and technology for the many advantages they offer in multiple fields. Mr. Green talked about the use of technology in health care, and it can also be helpful in sports and music. It can be beneficial and we must not refuse everything. Yet I do not want Canadians to have to make a choice between the advantages of technology and maintaining their privacy. They should not have to make that choice and the burden should not be entirely on individuals. I want Canadians to feel and know that institutions are there to protect them and advise them.
The Chair:
Thank you, Mr. Villemure and Mr. Dufresne.
That brings today's meeting to a close.
Mr. Dufresne, on behalf of Canadians and the committee, I want to thank you for your testimony today on this very important matter.
Thank you also to Ms. Ives.
[English]
We have a couple of items to deal with.
First, I'll go to Mr. Barrett.
Mr. Barrett, you have an oral motion that you'd like to put before the committee. I understand that you've spoken to committee members, and they all are in agreement with it. If you would put the motion on the floor, then we could have some discussion, if need be, on it.
Go ahead, sir.
Mr. Michael Barrett:
Mary Dawson passed away on December 24, 2023. I want to share a couple of notable points about her and why this is so important. I appreciate colleagues' agreement to bring this before the committee and to report it to the House.
She wasn't just the Ethics Commissioner. She was pretty remarkable. Her fingerprints are all over very important parts of our history, including her having drafted the Access to Information Act, the Privacy Act, the Canada Health Act, the Official Languages Act, the Competition Act, the Customs Act and the Young Offenders Act.
She was appointed Queen's Counsel in 1978 and became associate chief legislative counsel in the early 1980s. Aside from being the associate deputy minister of justice for nearly two decades, she was particularly proud of her constitutional work, including being the final drafter of the patriation package for the Constitution Act, 1982, and the Charter of Rights and Freedoms.
That is a very brief and incomplete summary of her impressive service to our country as a public servant and of her work as commissioner in calling balls and strikes. I think the mark of a good Ethics Commissioner is one who makes members of all parties equally uncomfortable, and she did that well.
Canada was well served by her contributions, and I appreciate colleagues' consideration of the motion.
She is a legend, for sure.
Is there any other discussion? I assume we have consensus on the motion—
Mr. Matthew Green: There is unanimous consent—not just consent.
(Motion agreed to)
The Chair: Thank you for bringing that forward, Mr. Barrett.
[Translation]
We have to adopt one last motion. It pertains to the budget for our current study on the federal government's use of technological tools capable of extracting personal data from mobile devices and computers.
Mr. Villemure, if the committee wishes to adopt the budget, the amount is $16,500. That covers the expenses for the witnesses, video conferencing, the work report and other related costs.
Is it the pleasure of the committee to adopt this motion?
Voices: Agreed.
(Motion agreed to)