:
I call the meeting to order.
Good afternoon, everyone.
[Translation]
Welcome to meeting no. 87 of the House of Commons Standing Committee on Access to Information, Privacy and Ethics.
[English]
Today's meeting is taking place in a hybrid format, pursuant to the Standing Orders. Members are attending in person in the room and remotely by using the Zoom application.
I would like to make a few comments before the business of the committee starts, for the benefit of witnesses and members.
Please wait until you are recognized by name before speaking.
For those participating by video conference, click on the microphone icon to activate your mike, and please mute yourself when you are not speaking.
Those on Zoom have the interpretation choices, at the bottom of their screens, of “floor”, “English” or “French”. Those in the room can use the earpiece and select the desired channel. Although the room is equipped with a powerful audio system, feedback events can occur. These can be extremely harmful to the interpreters and cause injuries.
I remind you that comments from members should be addressed through the chair.
[Translation]
Today, we have the same witnesses for two hours, to talk on two different topics.
For the first hour, pursuant to Standing Order 108(3)(h), we will receive a briefing on the annual report and other reports of the Privacy Commissioner.
Then, for the second hour, the committee will resume its study on the use of social media platforms.
[English]
I would now like to welcome our witnesses today.
From the Office of the Privacy Commissioner of Canada, we have Mr. Philippe Dufresne, Privacy Commissioner of Canada, and Mr. Michael Maguire, director, Personal Information Protection and Electronic Documents Act compliance directorate.
Welcome, gentlemen, to the committee.
Commissioner, you have five minutes to address the committee. Please go ahead, sir.
:
Good afternoon, Mr. Chair.
Good afternoon, members of the committee.
I am pleased to be here today to discuss my 2022‑23 Annual Report to Parliament, which highlights the important work that my office is doing to protect and promote the fundamental right to privacy in a time of unprecedented technological change.
It is encouraging to see this continued focus on the importance of privacy, as it impacts virtually all aspects of our lives.
Many of the public interest issues that you are seized with as parliamentarians—children's rights, online safety and cybersecurity, democratic rights, national security, equality rights, ethical corporate practices and the rule of law—all have privacy implications and, I would argue, all depend on strong privacy protections.
[English]
In this digital era, as you will see from some of the work and investigations my office has conducted this year, routine activities of daily life—for example, socializing online, using mobile apps, getting packages delivered or going to the checkout counter—can also raise privacy issues.
Since my appointment as Privacy Commissioner in June 2022, I've identified strategic priorities for my office that helped frame our work over the past year and that will guide the way ahead. These include addressing the privacy impacts of the fast-moving pace of technological advancements—especially in the world of artificial intelligence and generative AI—protecting children's privacy, and maximizing the OPC's impact in fully and effectively promoting and protecting the fundamental right to privacy.
[Translation]
To support these priorities, this past year we have engaged extensively with our domestic and international counterparts to identify and undertake collaborative opportunities.
We have also continued to advocate domestically for the modernization of Canada's privacy laws. I was honoured to appear before the Standing Committee on Industry and Technology last week in the context of its study of Bill C‑27, the digital charter implementation act, 2022, where I made 15 key recommendations to improve and strengthen the bill. I was pleased to see a number of them endorsed by the minister in the form of amendments that will be put forward to the committee, and I look forward to the work of Parliament in reviewing this important bill.
[English]
I will now turn to some of our compliance work from the last year.
We accepted 1,241 complaints under the Privacy Act, representing an increase of 37% over the previous year, and 454 under the Personal Information Protection and Electronic Documents Act, or PIPEDA, a 6% increase over the year before.
One of the public sector investigations highlighted in this year's report involved Canada Post's Smartmail marketing program. Our investigation revealed that Canada Post builds marketing lists with information gleaned from the envelopes and packages that it delivers to homes across Canada. It makes these lists available to advertisers for a fee. We found this contravened the Privacy Act, as it was done without the knowledge and consent of Canadians. We recommended that Canada Post stop its practice of using and disclosing personal information without first seeking authorization from Canadians. As a possible solution to remedy this matter, we recommended that Canada Post send a mail notice to Canadians to inform them of this practice and indicate an easy way for Canadians to opt out.
Until the tabling of my annual report, which made this decision public, Canada Post had not agreed to implement this solution. After the report was made public, Canada Post issued a statement that it would review its policies. I expect Canada Post to comply with the Privacy Act, and I look forward to hearing from them on the next steps to resolve this matter.
[Translation]
The report also highlights some of our private-sector investigations from last year, including our investigation of Home Depot's sharing with a social media company of the personal information of customers who opted for an electronic receipt instead of a printed one at checkout.
Home Depot has since stopped that practice and implemented my office's recommendations. This case underscored the importance of businesses obtaining meaningful consent to share customers' personal information.
Another important area of our work is addressing breaches in the public and private sectors.
We remain concerned about possible under-reporting of breach incidents in the public sector. The number of reported breaches fell by 36% to 298 last year, and only one of those reports involved a cyber-attack. This compares to 681 breach reports from the private sector, of which 278 were cyber-related.
[English]
We also engage in groundbreaking policy work, provide advice and guidance to organizations in both the public and private sectors on privacy matters of public interest and importance, and continue to provide advice to Parliament.
We know that privacy matters to Canadians more today than ever before and that they are concerned about the impact of technology on their privacy. Our latest survey of Canadians found that 93% have some level of concern about protecting their personal information and that half do not feel that they have enough information to understand the privacy implications of new technologies. This is why the work of my office to deliver concrete results that have meaningful impacts for Canadians and privacy in Canada is so important.
In closing, I would like to thank this committee for its work over the years, including the many reports and recommendations in the field of privacy. I cite them often. We certainly consider and consult them very often, and I know that Canadians do as well.
I look forward to continuing our efforts to ensure that privacy rights are respected and prioritized by government institutions and businesses alike, and to position Canada as a global leader on privacy.
I would now be happy to answer your questions.
:
Yes. Thanks very much, Mr. Chair.
We have a Prime Minister who broke Canada's ethics laws. The RCMP requested documents, and the Prime Minister used cabinet confidentiality to obstruct the release of those documents. This is an issue that we're going to have to revisit at this committee. It's of high importance to Canadians that they're able to have confidence in their democratic institutions, and no one is above the law, including the Prime Minister.
That said, Mr. Dufresne, I appreciate your opening comments, particularly with respect to the Crown corporation, Canada Post. I think that all Canadians expect it to follow the Privacy Act. I am heartened that following your investigation into Home Depot, they complied with your instruction. While I understand that Canada Post is reviewing the situation, it's very clear that they should also comply with your instruction.
Have you been made aware of instances in Canada of people's data being scraped and collected by foreign governments for nefarious purposes?
:
There are two things, Mr. Barrett.
In our statement dated August 24, 2023, we talked about some of the privacy risks in terms of data scraping. Some of them include targeted cyber-attacks, identity fraud, monitoring, profiling and surveilling of individuals, unauthorized political or intelligence-gathering purposes, or unwanted direct marketing or spam.
There are a number of risks, which is why we are calling on social media companies, and indeed all organizations, to respect privacy obligations. We set out a number of ways in terms of risk mitigation techniques that social media companies can and should take to protect that information from bad actors that would scrape the information.
We also, again, highlight some practices and offer advice to individuals, although the onus is not exclusively on individuals to protect themselves: Organizations have a duty, and there is advice that individuals can follow.
You made reference to TikTok. I initiated a commissioner-initiated complaint with respect to TikTok last year. We initiated this in February—this is a joint investigation—and I am moving forward with my provincial colleagues from Quebec, Alberta and British Columbia. We initiated that to look at the privacy practices. We are looking forward to completing this investigation, hopefully, by the end of March 2024.
Before I put my questions to Mr. Dufresne, I too would like to clear up a few things.
On Monday, I believe that the Chair abused his authority. I'd like to remind him of certain procedures and regulations that I believe were not followed.
[English]
You know that there are long-standing procedures and practices that govern the House of Commons standing committees. The process for undertaking subject matter studies, the process for moving motions and the role of the chair are outlined in the House of Commons Procedure and Practice. That is what I debated during the suggestion and motion to adjourn the meeting.
I will remind us that page 1061 of the third edition of House of Commons Procedure and Practice states:
A motion is needed to submit a proposal to a committee and obtain a decision on it. A motion is moved by a member to have the committee do something, order its Chair and staff to ensure that something is done (an order) or express an opinion on a matter (a resolution).
Page 1011 of the same edition states:
The committees then undertake to define the nature and scope of the study, to determine how much time they will devote to it and whether or not they will report their observations and recommendations to the House.
:
I'm almost done. Thank you very much, Chair.
Lastly, page 1039 states that the chair calls meetings and decides on the agenda for the meeting in compliance with instructions from the committee.
The process outlined above was not followed in the circumstances of the meeting scheduled for October 23, 2023. Therefore, I look forward to our debating that motion so that we can resolve what happened last Monday.
[Translation]
Thank you for allowing me to share my thoughts with you as well.
Having said that, thank you very much for being with us today, Mr. Dufresne. I'm happy to see you in person and to have the privilege of congratulating you on your appointment to this position. I know you were appointed some time ago, but I'm very happy to see you in this position.
You mentioned that you had a backlog of complaints that needed to be dealt with, and that it was starting to put a strain on your resources.
What course of action or approach are you thinking of taking? From what I understand, your organization's work is becoming increasingly complex, particularly in terms of automation.
I'd like you to tell us about the complexity.
:
Thank you very much, Ms. Fortier. I'm happy to see you in person as well.
We addressed this issue early in my term, because it's important that we make quality decisions, but how fast we make them is equally important. Decisions must be delivered within a reasonable timeframe. However, when too many requests are received, it takes longer to respond. We've therefore identified a need and obtained additional resources from Parliament. We're grateful for that.
We're looking at this issue from all angles. We're reviewing our internal processes to determine whether we can operate in a more agile way, whether we're adequately managing risk, whether we can use other technologies, and whether we can use incentives to encourage organizations to resolve disputes more rapidly, for example. I'm a big believer in voluntary dispute resolution.
To improve efficiency at the Office of the Commissioner, I've had a lot of discussions with industry and government representatives to understand the barriers and benefits. One thing I'd like to do is recognize the government's or industry's good work when it comes to privacy, not just their shortfalls, to encourage them to continue moving in the right direction.
There are many opportunities to improve efficiency at the Office of the Commissioner, but it certainly remains one of the main challenges. That's why our efficiency is one of my three strategic priorities, along with technology and protecting children's privacy.
We're really going to do everything we can to improve the way we operate. We've already started to see an improvement.
:
Yes, for the moment, privacy breaches in the public sector are reported in accordance with Treasury Board directives. There's a legal obligation in the private sector. We definitely have recommendations on the subject. I think it's useful to have binding legal obligations because that encourages organizations to take action. We need them in both the public and private sectors.
However, I also think it's a matter of understanding and communication. You have to understand the criterion for reporting privacy breaches. Sometimes organizations acting in good faith have a poor understanding of that criterion or else underestimate the risk of harm.
We saw this in some of our investigations this year. Some organizations indicated that they hadn't reported a privacy breach because they thought the risk of harm wasn't high enough. In some cases, we disagreed and determined that there had been a risk of financial harm, reputational harm or disclosure of sensitive information.
Consequently, we have some work to do to increase awareness, and we have to make sure we have the necessary tools for that purpose. However, we will continue working on this and encourage organizations to look into these issues. When they report breaches to us, we can offer them opinions and advice and work with them. That's really our objective.
We also work with citizens because we have to find solutions to protect the victims of those breaches.
:
One of the things I think about is how to communicate with citizens more effectively. What you've done in your riding with your seminars and discussions is very good, and it's a good step in the right direction.
We have to get to a point where Canadians understand what's happening, and we have to equip them to do that. Technology advances very quickly. You can see that with generative artificial intelligence, and other technologies will emerge. Sometimes Canadians may feel confused about it all because everything changes so quickly.
What can we do about it? Sometimes I hear people say it's too late to protect privacy because everything's moving too fast, and they give up. If there's one thing that I consider a concern, it's that.
I think it's important to tell people that we have to protect privacy, that it's possible to do so, that institutions can do it and that people can do it as well. Statutes will never be amended as quickly as technology evolves. The same is true of the regulations the government makes.
However, we need to pass legislation based on principles that can be applied to new technology. I'm a real believer in privacy risk assessments and in making them an obligation. I'm a true believer in transparency and in communicating more and more effectively with Canadians about what can be done with their information and how it will be used.
Consent provisions are often very hard to understand, even for experts. Consequently, people grow tired of it all. In the investigations I discuss in my report, whether they concern Canada Post, Home Depot or Tim Hortons, people are sometimes surprised by what's done with their information.
In our discussions with organizations, we asked them to be proactive and to make that information readily accessible. Sometimes their response is that their information is provided in the privacy policy on their website or at the post office. Then we tell them that they're asking Canadians to bear the burden of searching for that information when those organizations are in a better position to communicate it than they are.
:
I think we have to hold public discussions, be transparent and have obligations to be transparent.
The phenomenon you're describing has accelerated even more with artificial intelligence. We may think we know our personal information will be used by such and such an entity. However, do we really know what anyone can conclude about us based on that information? What inferences can be drawn? Sometimes postal codes or tastes in music, for example, can help someone deduce a person's sexual orientation, income level and so on. People don't know all that.
I recommended that Bill C‑27 provide for a transparency obligation so that, when a decision is reached about a person with the help of artificial intelligence, that person can request an explanation in every case. However, the current version of the bill provides that a general account may be provided only in cases that would have a significant impact on the individuals concerned. I recommended that that part be deleted because, for the moment, I think it's better to encourage more transparency rather than less.
We have to try to find pleasant ways to explain this. One of my mandates is to try to acquire tools. We provide a lot of information on our website, and we try to explain it all as best we can, but I think we can do better.
We also have to talk about children, because I think the message has to be adapted to suit the audience.
:
Thank you very much, Mr. Chair.
I'd like to thank Mr. Dufresne for taking part in our important study.
Mr. Dufresne, I learned something recently that really struck me.
You just said that updates to laws and regulations will never be able to keep up with technological advances. We agree on that.
Two years ago, the Department of Justice conducted an online public consultation on the modernization of the Privacy Act. However, there have been no major updates to the act since it was adopted in 1983. In 1983, when I was 10, we used floppy disks and I watched movies on VHS tapes.
In your opinion, how urgent is it to modernize the Privacy Act?
:
I think it's absolutely essential to modernize this act. We also need to modernize the federal privacy legislation that deals with the private sector. That law is 20 years old, so it's older than Facebook and social media. It is positive that Bill C‑27 aims to modernize privacy protection with respect to the private sector. I look forward to seeing this bill move forward.
In addition, I hope that a bill to modernize the act for the public sector will soon follow. The minister had said, when Bill C‑27 was tabled, that a public sector privacy bill would follow. Consultations were held with first nations and indigenous peoples on certain implications. The Department of Justice published a report on these consultations—I believe it was in September. The work is ongoing. In my opinion, the solution is to move forward with Bill C‑27. The model passed in this legislation can then be adapted to the public sector, as needed. That could be beneficial.
Among our proposals, we suggest that there should be an increasing number of public-private partnerships and that the government should work hand in hand with the industry. At present, we have two laws with different requirements for government and the private sector. This is not optimal, and it creates problems in terms of interoperability. I entirely agree with you that this is becoming important.
In the meantime, the law applies, and our office will continue to implement it to the best of our ability. In fact, this is a message that my counterparts from the G7 countries and I conveyed when we were in Tokyo last summer. At that meeting, we talked about artificial intelligence. To address people's concerns, we said we needed laws on artificial intelligence. There are already some—privacy laws, for instance. They exist and they are enforced.
I've also launched an investigation into ChatGPT, to confirm whether or not it is compliant with the legislation. Tools do exist, but they absolutely must be modernized. We will be there to support Parliament.
:
Certainly. There are sometimes discussions about that.
As for the cases you're referring to, sometimes a department tells us it's already doing what we recommend. In the case of the pandemic, we also carried out an assessment of proportionality and necessity, which is not mandatory under the Privacy Act, but which we feel should be. We put forward that analysis.
It's a dialogue. We are always given the reasons for refusal, and dialogue is established.
Some breaches are more serious than others. The really worrying situations are those where there has actually been a major breach or a major consequence, combined with a complete refusal to follow our recommendation. That can undermine trust.
I feel the power to issue orders is important. When an officer of Parliament makes a recommendation to an organization and the latter refuses to implement it, the situation is not satisfactory. I believe there must be sufficient justifications given. If we had the power to issue orders, this wouldn't be a problem. We'd issue them when necessary. With that said, in my opinion, they should only ever be used exceptionally.
The same applies to fines. Bill C‑27 would add the possibility of imposing significant financial penalties on organizations. I think this is very important, for the same reason again: to create incentives. The idea is not to use them often, but...
:
Yes, Quebec's law 25 definitely has more teeth than existing federal laws, simply because it grants the power to issue orders. Quebec's access to information authority, the Commission d'accès à l'information, or CAI, can issue binding orders and impose heavy fines, similar to the European model under the General Data Protection Regulation. That makes it a more robust piece of legislation on that front. It lays out proactive obligations.
Hopefully, Bill C‑27 will make its way successfully through Parliament and bring federal laws more up to date in that regard. It's not exactly the same as law 25, but it comes close with the power to issue orders and to impose fines, as well as proactive obligations on companies. I think it's a good model, following in the footsteps of Europe and Quebec. I think, federally, we can get there.
To answer your question about working with the CAI, I can report that we do indeed work very closely with Quebec and all the provinces and territories.
I was in Quebec City in September for the annual gathering of federal, provincial and territorial privacy commissioners, which the CAI hosted. We had some very important and useful conversations. We put out two resolutions, including on the protection of young people's privacy. They are joint statements reflecting principles that all the commissioners have agreed upon, despite the legislative differences between the jurisdictions. In this way, the commissioners are trying to make things easier for companies by flagging common elements across the different regimes. My office carries out joint investigations with provinces that have regimes similar to the federal government's, so Quebec, Alberta and British Columbia. We worked together on the investigations into TikTok, ChatGPT and Tim Hortons.
Our collaborative work is not only extensive, but also very useful. We are able to make sure that we are on the same page across the country.
:
Certainly, and thank you for the question. We talked about that in the annual report.
There is a very strong active domestic—federal-provincial-territorial—Canadian community, but also internationally there are a number of groups. I've been very active with the G7 round table of data protection authorities. Data protection authorities are essentially privacy commissioners from the G7 countries. We meet annually. We met a year and a half ago in Bonn, Germany. This year we met in Tokyo. One of the key themes of that group has been that we need to have cross-border data flows to ensure that we can have strong international trade when data is travelling from jurisdiction to jurisdiction. How do you ensure that it's protected and safe?
There are a number of tools—legislative tools, contractual programs and so on. We have discussions on that. AI has been a growing topic. Last June in Tokyo we issued a statement about our expectations. I think it was one of the first statements in which privacy commissioners set out our expectations for AI from a privacy perspective. We said, for one thing, that current laws apply. Privacy law applies. It's not a legal void. We already have protections, and we are going to apply them. We stated our expectation that organizations build in privacy by design and that they conduct privacy impact assessments when developing these tools.
It was a call to action. I was happy to see, in the industry department's voluntary code of practice for AI that was launched a couple of weeks ago, that the G7 declaration was highlighted, as was a reminder that the Privacy Act continues to be important.
Privacy laws apply to personal information, information that can identify us, because it has to be protected. That information can be used to draw a number of conclusions about us. The law sets out an exception: public information is not subject to certain obligations. Nevertheless, the exception has a very narrow definition. It has to be prescribed by regulation, and it's very limited.
Generally speaking, information that is online is public. Personal information, however, is still personal information. That means organizations are not allowed to use the information however they wish. They have to adhere to the applicable principles. That is the reason why we have investigated organizations that used excessive means to collect photos online to build facial recognition databases and, then, tried to sell them to police.
First, we conducted an investigation into the company Clearview AI, and we found that the database went way too far. There was no framework of restrictions, and the company did not set parameters with respect to necessity, proportionality and so forth.
Second, we conducted an investigation and tabled a special report to Parliament on the RCMP's use of the company at the time.
We found that the RCMP violated the act by using the company and failed to meet its own obligations. The RCMP has since stopped using the company and initiated the national technology onboarding program.
That is a very clear example of how information that appears online is still considered personal information.
Under the Privacy Act, the public sector's obligations are less stringent than the private sector's. Departments are required to show that the information is used for purposes related to their respective mandates. For example, they have to show that they have a legal mandate to do X, so they can do it.
Some obligations are more specific, like those at issue in the Canada Post case. When an organization uses information indirectly, the obligation threshold is greater. It has to ask for permission. The first major consideration when a public organization uses information is whether the activity is relevant to its mandate.
We think it's important to impose the obligations of necessity and proportionality, in keeping with international principles and practices in the private sector. The idea is to consider what information the organization is collecting and for what purpose. It's a bit similar to how it works for charter human rights. Is the organization's purpose important enough? Will the measure achieve the purpose? Has the organization done everything possible to minimize the use of the information in achieving its purpose?
We underscored those principles in our report on the pandemic, and we apply them. While we realize they aren't binding, we apply them and use them to inform our recommendations. We've been able to draw some useful lessons. On the whole, the government adheres to the principles. Occasionally, we're of the view that there should have been more information on how the organization assessed the discarded options, but that, on balance, its decision was justifiable.
It's a standard that encourages decision-makers to ask questions about what they're doing and whether they are minimizing the risks. That's more or less what we are asking.
One of my major recommendations for Bill C‑27 is to require organizations to conduct audits and privacy impact assessments, or PIAs. It's about considering what the risks are and which measures can minimize them.
PIAs are good for privacy, and they're good for Canadians.
:
There are three recommendations we made specifically on AI that would help with that issue. One was mandating privacy impact assessments whenever you have a high-impact AI system. Doing that, as an organization you would need to ask what the risk to privacy is. What is the risk of these types of deepfakes? How are you mitigating that? There are some proposed provisions in the AIDA, the artificial intelligence and data act, that would do that as well.
We recommended greater transparency for AI decisions. If a decision is made about you, you can ask for an explanation. If you see something strange, like a video of you, and you ask that question, you should get an explanation.
We also recommended collaboration among regulators wherever we can. I've just launched, with the Competition Bureau and the CRTC chair, a digital regulators forum, but there are limits on what we can do. We can't collaborate in investigations, for example. I can do that with the FTC in the U.S. and other countries, but I can't do it in Canada. That's a gap that would be easily fixed, and, in my view, it should be fixed.
:
I would say a few things. One is that we've issued a declaration with my federal, provincial and territorial colleagues called “Putting best interests of young people at the forefront of privacy and access to personal information”. It's available on our website. We give a number of recommendations and expectations for organizations about making sure that they're protecting children and the best interests of the child and that they're treating their information appropriately.
In terms of what people should do—and that's something we've said in our data-scraping statement with my international colleagues—ask yourself if you are comfortable sharing this much information. Do you know enough about the settings and the protections that are there? Is this something you want to potentially remain visible forever?
In Bill , there's a new proposed section on disposing of information, especially for minors. That's good, but whenever you're putting a picture of your children online, ask yourself if you want to take that risk. Have you set the privacy settings strongly enough? Are you sharing this with the whole world? If you don't understand enough about what the organization is doing and you find its privacy policy complex, I always encourage everyone to ask the organization.
Ask for more information. When stores ask for your birthday, ask them why they want to know your birthday when you're buying jewellery or any kind of item. Why do they need that information?
It's about developing that reflex of not just saying, “Yes, sure, I'll give it to you.”
:
Thank you, Mr. Kelloway.
It's always solid, Mr. Dufresne. I appreciate that.
That concludes our first hour. What I would like to do is roll right into the next hour and give Mr. Dufresne a second to get his notes together.
I want to make the committee aware that I had a request from TikTok to extend by a week the requirement to provide us with written responses to the written questions. If the committee recalls, it was supposed to be this Friday. They've asked to have until next Friday.
With the committee's consent, I'd like to give them that extension so that we get the answers we need. Is that okay?
Some hon. members: Agreed.
The Chair: We had a lot of crosstalk about TikTok, so we're going to move into our second hour, which is our social media study focused on TikTok.
Mr. Dufresne, if you'd like to address the committee for five minutes, I'd appreciate that. We'll then get into questioning.
Thank you. Go ahead.
I'm pleased to now turn to this part of the discussion. I thank the committee for its interest in the ways that social media platforms such as TikTok harvest, handle and share personal information.
The online world brings with it a host of possibilities for innovation and connection, but it also carries potential for significant harm, especially for young people.
As you know, my office, along with our counterparts in Quebec, British Columbia and Alberta, launched an investigation into TikTok in February. We are examining whether TikTok's practices comply with Canadian privacy legislation, and in particular whether it obtains valid and meaningful consent for the collection, use and disclosure of personal information.
We are also looking at whether a reasonable person would consider the purposes for which it handles personal information, in particular children's information, to be appropriate in the circumstances.
[Translation]
This matter is a high priority for my office, especially given the importance of protecting the fundamental right to privacy of young people, who represent a notable proportion of TikTok users. As a result of the ongoing investigation, there are limits to my ability to speak publicly about the company’s practices at the moment.
For that reason, I will focus my remarks today on the privacy principles that underpin my office’s approach to the digital world from the perspective of the privacy rights of children.
Growing up in the digital age presents significant new challenges for the privacy of young people. As children and youth embrace new technologies and experience much of their lives online, we need strong safeguards governing how their personal information may be collected, used and disclosed. Increasingly, their information is being used to create personalized content and advertising profiles that are ultimately aimed at influencing their behaviours.
[English]
Children have a right to be children, even in the digital world. As UNICEF notes in its policy guidance on artificial intelligence for children, young people are affected by digital technologies to a greater extent than adults. Young people are also less able to understand and appreciate the long-term implications of consenting to their data collection. Privacy laws should recognize the rights of the child and the right to be a child. This means interpreting the privacy provisions in the legislation in a way that is consistent with the best interests of the child.
I'm encouraged by statements from the minister indicating that there is a desire to strengthen children's privacy rights in Bill . My office has recommended that the preamble of the modernized federal privacy law should recognize that the processing of personal data should respect children's privacy and the best interests of the child. I believe that this would encourage organizations to build privacy for children into their products and services by design and by default. I was pleased to hear the minister signalling his agreement with that recommendation.
[Translation]
The law must have strong safeguards to protect children’s information from unauthorized access, and reflect greater consideration of the appropriateness of collecting, using and disclosing their information.
Earlier this month, my provincial and territorial colleagues and I adopted a resolution calling on organizations in the private and public sectors to put the best interests of young people first by, among other things, providing privacy tools and consent mechanisms that are appropriate for young people and their maturity level; rejecting the kind of deceptive practices that influence young people to make poor privacy decisions or to engage in harmful behaviours; and allowing for the deletion and de‑indexing of information that was collected when users were children.
I am happy to see this was included in Bill .
[English]
In closing, it's critical that government and organizations take action to ensure that young people can benefit from technology and be active online without the risk of being targeted, manipulated or harmed as a result. I expect that the findings from our investigation into TikTok will be informative not just for that company but also for other organizations that collect and handle children’s sensitive personal information.
I also look forward to seeing Bill progress through the legislative process in a way that will provide children and minors with the privacy protections that they need in this increasingly digital world.
With that, I will be happy to take your questions.
:
Thank you very much, Mr. Chair.
Thank you, Mr. Dufresne, for being with us.
I'll take a brief step back in time. In the last century, when people bought a newspaper in the morning, they felt they were buying a product that gave them news, an account of what was happening in their society. But then someone thought about it and said that the newspaper was actually selling the reader to the person who was buying advertising in the paper. The buyer wasn't who we thought. In those days, a company that owned a lingerie store, for example, and bought advertising in the newspaper generally had no idea who its customers were or even who the newspaper's readers were.
Today, however, with social media, the dynamic is completely different. Indeed, people almost voluntarily provide their personal information to large conglomerates that sell or share this information in order to make huge profits. In other words, citizens provide information free of charge, enabling these large companies to target advertising precisely to their wants, desires and needs. At the same time, this enables large companies, large conglomerates, to reap considerable profits.
You spoke earlier of a consumer or citizen reflex. Do you get the impression that most people around us understand that they are selling themselves for free to the Web giants and social media?
:
Thank you, Mr. Dufresne.
In one of his sets, which I won't redo here, Louis-José Houde says that just because he buys a spatula doesn't mean we'll have access to his cell phone.
On a more serious note, in 2019 you did an investigation into Facebook, which is now called Meta. The report included four major findings, which were quite disturbing, namely that Facebook failed to obtain valid consent not only from users but also from those users' friends or contacts. In addition, measures to protect users' personal information were inadequate. These four findings are quite worrying, given the popularity of this platform.
Since you carried out this investigation and came to these four conclusions, what exactly has happened?
:
Thank you, Mr. Boulerice and Mr. Dufresne.
[English]
Before we proceed to the next round, I have had a request. I want you to consider this request. I'm not asking for consideration now, but I will ask at the end of the meeting.
As I mentioned at the outset of this particular round, there was some discussion about TikTok in the last hour. The request is to have some of that information extracted and placed into this report, as it is relevant to our study on TikTok.
I see that Mr. Barrett is not here. He has stepped out for a second. That's why I want to give you a bit of notice to consider this.
Ms. Gladu, you have five minutes. Go ahead, please.
:
It's a problem for kids because of their greater vulnerability. We've made a number of recommendations about making sure that we're not using these behavioural techniques of nudging. We shouldn't be nudging individuals generally, and certainly not children, into making bad privacy decisions. There needs to be work on that.
There have been reports on social media being addictive, particularly for children. Sometimes the business model is to try to encourage them to stay longer, because that's what generates more revenue. That has to be taken into consideration with children, who have been online more and more during the pandemic, and since then with school. I've seen it and parents have seen it.
We need to adjust to this new reality as parents, children and society as a whole, so that there's a greater awareness of what this means and what their rights are.
Bill proposes a right to disposal. When I say that children have a right to be children, that's what I'm alluding to. Children do things online. If it stays online forever, then they're treated as adults right from when they're teenagers. It stays forever, and it could be used against them for jobs and so on.
We need to deal with this. Bill will deal with it to some extent, but we certainly need to build greater awareness of it as we are living more and more in a digital world. It brings innovation and it brings great things, but we need to be well equipped to deal with it and we need to learn about it. I would hope to see mandatory training in schools early on, so that individuals can get the tools early on.
We'll get these reflexes. We're going to ask questions. We're going to ask why they need this information. We're going to learn to recognize a good privacy policy, and when a policy isn't good, we're going to learn how to complain about it so that it can become a good privacy policy in the future.
That way, we're creating ambassadors for privacy everywhere.
:
Yes, a revision of both laws is necessary. A revision of the law governing the private sector is under way. This is Bill , which also includes a specific component on artificial intelligence.
A revision is necessary because the law is 20 years old. It's older than social media. We're still applying it, and the principles are there, but technology is advancing rapidly. In my opinion, this calls for stronger proactive obligations. For example, we need to require organizations to carry out basic assessments and disclose them to our office; we also need to impose greater transparency, particularly when it comes to artificial intelligence.
The law governing the public sector, on the other hand, is even older. It dates back 40 years. It needs to be modernized and strengthened, because it was passed at a time when the impact of data was not what it is today.
:
Fortunately, the law is based on principles, so we're able to apply those principles to organizations that use and disclose data. That's what allows us to investigate TikTok and ChatGPT.
That said, there are shortcomings: we don't have the power to issue orders or fines.
In the case of organizations making huge profits from data, there is a shortcoming. It may not have been an issue before because companies weren't making so much money from data, but, now, they are.
So there have to be fines. We need to be more proactive. We need greater transparency. Explaining decisions made by algorithms, by artificial intelligence, obviously wasn't an issue back then. We can regulate this with principles, but certain things become a little more technical. I think that, when it comes to artificial intelligence and algorithmic decisions, our requirements need to be broad enough that they still apply five years from now, ten years from now, to ChatGPT's successors. These requirements must be reinforced.
This raises the whole question of online age verification and of techniques for determining whether a person is underage. This will be important in the context of Bill , which explicitly grants rights and treats information differently. It's an issue we're looking at in the privacy field, and there's a lot of discussion about it. In fact, the Information Commissioner's Office of the U.K. has issued guidelines on verification tools.
What we're saying, at the Office of the Privacy Commissioner, is that these tools need to be appropriate and not ask for too much personal information. Age verification needs to be managed, but we don't necessarily want to ask for too much personal information to do that. That said, there are ways of doing it and technologies to do it. It's another area where we need to be creative.
Also, it has to be context-appropriate. Some sites may be higher-risk and will require tighter verification. We can think of gambling or pornography sites, for example. Some sites may be less sensitive. Others may be aimed specifically at children. There may be a presumption.
I think this will be part of the implementation of this law. My office will have a role to play in this as it can issue guidelines.
In addition, the bill also provides for the creation of codes of practice and certification programs.
This will encourage organizations to adhere to a set of rules. If they respect them, it will have an effect on the complaints process, which will be beneficial for these organizations. So it will be one more tool. I expect that the Office of the Privacy Commissioner will be able to work on it, precisely to provide these details.
The Office of the Privacy Commissioner also has an advisory mandate. Companies, especially small and medium-sized enterprises, can contact us for answers to specific questions. We're here to help them with questions like these, especially those of a more technical nature.
Thank you, Mr. Dufresne and Mr. Boulerice.
[English]
Before we go, first of all, I want to thank you, Mr. Dufresne, for being here today. I know a couple of hours is a long time. You were solid, as always.
Mr. Maguire, thank you for being here. You were solid in support.
On behalf of Canadians, I want to thank you for your service to our nation.
My understanding is—I caught up with Mr. Barrett—that there is consent to extract the first hour of TikTok information to put into this study, so we will do that and we will make sure that the analysts do that.
That's it for today.
I want to thank everyone. Thank you to our clerk, our analysts and our technicians.
The meeting is adjourned.