Notices of Meeting include information about the subject matter to be examined by the committee and date, time and place of the meeting, as well as a list of any witnesses scheduled to appear. The Evidence is the edited and revised transcript of what is said before a committee. The Minutes of Proceedings are the official record of the business conducted by the committee at a sitting.
Welcome to the 144th meeting of the Standing Committee on Access to Information, Privacy and Ethics.
Pursuant to Standing Order 108(3)(h) and the motion adopted by the committee on Thursday, November 21, 2024, the committee will begin its study of the liquidation of TikTok Technology Canada Inc.
[English]
I'd like to welcome our witnesses for the first hour today.
From the Office of the Privacy Commissioner of Canada, we have Privacy Commissioner Philippe Dufresne.
Welcome back, Mr. Dufresne. I hope you went home and were able to come back. We had you here on Thursday.
Also from the Office of the Privacy Commissioner of Canada, we have Marc Chénier, deputy commissioner and senior general counsel.
Just before we begin with opening comments, we do have a study budget for this. I'd like to get this adopted now. It's $1,750 for this study.
Is there any objection to the study budget from members of the committee?
Good afternoon, Mr. Chair and members of the committee.
Thank you for inviting me to appear as part of your study on the government's decision to order the wind‑up of TikTok Technology Canada Inc.
I'm pleased to be able to contribute to this important discussion on national security and privacy with respect to foreign influence on digital platforms, including social media, and foreign ownership or control in that context.
[English]
The government's decision to order the wind-up of the Canadian business carried on by TikTok Technology Canada Inc. was made pursuant to the Investment Canada Act, which allows for the review of foreign investments that may be injurious to Canada's national security.
According to the guidelines on the national security review of investments, these reviews may look at a number of factors, including whether a foreign investment could facilitate access to sensitive personal data, including personally identifiable health or genetic information; biometric information; financial information; private communications; geolocation; or personal data concerning government officials.
This assessment is made by the government. My office was not involved or consulted with respect to this assessment. Indeed, we learned of it when it was announced publicly on November 6, 2024.
(1630)
[Translation]
As you know, in February 2023, I launched an investigation into the TikTok social media platform with my counterparts from Quebec, British Columbia and Alberta.
We will determine whether TikTok's practices comply with Canadian privacy laws and, more specifically, whether TikTok has obtained informed consent for the collection, use and disclosure of personal information.
Given the importance of protecting children's privacy, the joint investigation focuses on TikTok's privacy practices for young users.
[English]
I expect that the findings from our investigation into TikTok will be informative not just for that company but also for other organizations that collect and handle children's sensitive personal information. The government's decision to order the wind-up of TikTok Technology Canada Inc. does not impact my authority to investigate. We are nearing the end of this investigation. My goal is to have it concluded in the next few months. As the investigation is ongoing, I am limited as to what else I can share at this time.
Championing children's privacy, addressing and advocating for privacy in this time of technological change, and maximizing our impact are my three strategic priorities for the Office of the Privacy Commissioner. To this end, in October of this year I issued a statement with my G7 counterparts on artificial intelligence and children. The statement highlights the importance of ensuring that technologies such as AI are developed in a manner consistent with the best interests of the child.
[Translation]
To achieve this important balance between innovation and the fundamental right to privacy, jurisdictions must work together so that citizens can actively participate in the digital world knowing that their fundamental right to privacy is protected.
To achieve this, we are setting parameters that will help organizations innovate while fostering a culture where the principles of privacy by design and by default are embedded in their core business.
In closing, I would like to thank and congratulate the committee for its most recent report, released last Thursday, entitled “Oversight of Social Media Platforms: Ensuring Privacy and Safety Online.”
[English]
I fully support your recommendations, which are good for privacy, for Canadians, for the public interest and for innovation. I look forward to sharing and discussing these important insights with my counterparts in Canada and internationally.
Thank you again. I'd be happy to answer your questions.
Before we commence with Mr. Barrett, I have Mr. Green online.
Mr. Green, we went through a study budget here of up to $1,750. It doesn't mean we're going to spend it all. Whatever we don't spend gets put back to the House. That's for working meals and headsets. We may or may not need them all.
Are you okay with that?
I have the consent of the committee, Madam Clerk, so the study budget is approved.
We are investigating the transparency of the consent practices of TikTok. The investigation is under way, so I can't speak about the substance of that investigation. When we announced it, we did indicate that it would have a specific focus on children's privacy and younger users.
We are working hard, with our colleagues from Quebec, British Columbia and Alberta, to complete this as soon as possible. I hope to do so, as I said, in the next few months.
I think that there's a bit of a problem for Canadians who are trying to make an informed decision for themselves, for parents who are trying to parent their children and for children who are trying to make informed decisions for themselves.
We have a government that has said that this app is okay to continue to use. However, the offices of this business that operates this platform must be closed. There are specific risks, but the government can't tell us what they are. It's a complete absence of transparency.
You're an independent officer of Parliament, so you serve a function that's important to help keep Canadians informed and to help be a check against some of what government does.
Should the government have been more transparent with Canadians on a matter that deals with the protection of Canadians' privacy?
Should the government have aligned its decision and announcements with the release of your report?
The government's decision, which was announced on November 6 by the Minister of Innovation, indicates that this was made under the Investment Canada Act. It's made for national security reasons.
Our investigation is a different track. Our investigation is looking at the privacy of Canadians, and children in particular. We're moving that forward, but the national security component is a separate aspect. I can't speak to that decision and to the reasons behind it. I wasn't part of it. I wasn't consulted on it.
What I can say is that my provincial and territorial colleagues and I have issued a statement on the privacy of children. We make a number of points in it, setting out our expectations for organizations, giving tips, highlighting certain things for parents and children, making sure that privacy protections are highest, and calling out organizations when their practices aren't clear enough. We're going to continue to use the tools that we have, which are promotion and the investigation power. We look forward to completing that.
The government has ordered the closure of the TikTok offices in Canada.
Whom are you going to investigate in the future if an issue is raised with respect to TikTok, if they have no presence in Canada and if they terminate...as has been ordered by the government, for very opaque reasons?
We have no metric or measure to be able to judge whether those are good reasons or not, but they've ordered the closure of these offices.
Whom would you investigate, and from whom would you request documents?
Would you ask foreign entities? Would those foreign entities have any obligation whatsoever to participate or co-operate with your investigations?
Canadian law will apply to a matter if it impacts Canadian users. For privacy law in Canada to apply, it's not necessary for an organization to have offices in Canada, or for the information to have been generated in Canada. The courts have recognized that if there's a real and substantial connection to Canada, and if Canadian users can be impacted, we have jurisdiction.
As to how we investigate those things, if the organization doesn't have anyone in Canada, we would reach out to the organization in another country.
We don't have any enforcement powers in other countries. We don't have the ability to force an organization to collaborate with us or provide us with information if they're outside Canada.
It seems a bit problematic, then. You might see what I'm driving at here. If we have no entity in this country for you to collaborate with, it creates a problem.
It just seems odd that the government would say, “This is so dangerous that we have to close the office, but it's okay for children to continue using the app.” I guess what I'm driving at here is that they need to pick a lane. Either it's safe for Canadians or it's not. We just don't know.
Well, I'm focused on the investigation we're pursuing. We're interacting on this with TikTok. We're asking the questions. We're going to be publishing our report, as I said, as soon as we can. That will contain our conclusions with regard to their compliance with Canadian privacy law.
Thank you to the witnesses who are appearing today.
Thank you, Commissioner.
I'll start by asking this: Is it your understanding that the government's decision under the Investment Canada Act review relates entirely to TikTok's business operations, not the app itself?
I haven't seen any order from the government banning the app in Canada, or preventing its use. It's calling on the organization to cease its corporate activities in Canada.
No, we're not involved in that decision. This is a decision made by, I believe, the Department of Industry in consultation with national security experts in the government. It is ultimately a decision of the executive.
We were not consulted in the context of this specific decision. We could be consulted on a privacy impact assessment of the general operation of the program itself, but this decision is not one that involves me. It's not a decision where I would have been able to provide privacy input.
You and I are both of a generation where we experienced life without the Internet, then experienced life with the Internet and digital technologies. I think everybody around this table is of those generations. Well, maybe Mr. Caputo isn't.
Voices: Oh, oh!
Ms. Iqra Khalid: There have been concerns that TikTok's winding up of its operations on Canadian shores will heavily handicap your oversight of the app and of any data violations. However, it's my understanding that your office can assert jurisdiction over foreign companies without their having a physical presence in Canada. The physical location of servers is not determinative of your office's capacity to do its job.
That's correct in the sense that, for me to have jurisdiction over a matter, there needs to be what the courts have called “a real and substantial connection to Canada”. That could be by looking at things like the location of the target audience of the website. If there are Canadian users, then that factor is met. We also look at the source of the content of the website, the location of the website operator and the location of the host server.
It is a contextual assessment, but we usually look at jurisdiction when Canadians are affected by this, and if Canadians are using the app, then we take jurisdiction.
Where there can be more challenges is when it's time to enforce the decision or to compel things; if things are in another country, then we need to use the courts of that other country.
We still have jurisdiction to investigate the website, to investigate the impacts on Canadians and to investigate whether it complies with privacy law. We have jurisdiction to make an order and to seek a court order from Canadian courts.
Where the issue could come up in terms of enforcement is if all the assets are in another country. It then becomes an issue of private international law where you seek to have another court of another country enforce a decision of Canadian courts.
In this instance, we are still conducting our investigation. We're going to be issuing our report of findings in the coming months with my colleagues from Quebec, B.C. and Alberta, so I won't say any more about that. We haven't finalized the outcome. I don't have order-making powers, but in my case, the outcome could be recommendations. I could take action before Canadian courts. My colleagues in the provinces have order-making powers, so that's an issue that would come up at a later date.
Is it something that the RCMP would get involved in once your investigation is completed, or would that be a civil case that you think would need to ensue?
Our investigation would not involve the RCMP. We conduct our investigation, and we issue our report. Then, if there is a finding that the law was breached, if there are recommendations or provincial orders, or if we take the matter forward to court, then it becomes a civil proceeding.
Are there any recommendations that you have to make to this committee with respect to how we should conduct ourselves, not just with how TikTok has operated with this move but also with other social media platforms? Is there anything that you think we can be doing to better legislate protective supports for our communities and our country?
We're over our time here. I'm going to have to get you, Mr. Dufresne, to circle back on that response, if you don't mind, in the next round of Liberal questions. We're 28 seconds over time on that one.
[Translation]
Mr. Villemure, you have the floor for six minutes.
Thank you for being here, Commissioner and Mr. Chénier.
I think we're dealing with a paradoxical injunction, where we're being asked to do one thing and the opposite. I won't go into the details that you've already presented to my colleagues, but I'd like to know what you think of the Canadian government's decision to expel TikTok, while allowing its use. It seems paradoxical to me.
I can hardly comment on this decision, because I don't have the information on which the government based its decision.
The minister's statement says that this decision was made under the Investment Canada Act on national security grounds, and it explains what it does and doesn't do. I can't judge that. All I can say is that, in parallel to this, we're already conducting an investigation into TikTok to determine whether the Personal Information Protection and Electronic Documents Act is being complied with, particularly as regards young users, and that's what we're going to do.
I'm not asking you to reveal anything about your ongoing investigation, but it seems that this decision, which is based on the Investment Canada Act, doesn't help you. It seems to make your job a little more difficult.
At this point, as I indicated, we're continuing our investigation. It doesn't affect our investigation. We'll be able to complete it in the coming months and make our decision.
Earlier, with your colleagues, we discussed what would happen if the company refused to cooperate or comply with the act when it no longer has a subsidiary in Canada. They were wondering if this could raise different issues. It's possible. However, for the time being, we're using the tools that the act gives us.
There's no doubt that this power could be useful. As you recommended in your report recently, this authority is a missing element of Canada's Personal Information Protection and Electronic Documents Act. That's clear when this legislation is compared, even within Canada, with the laws of some of the provinces and those of our international counterparts in Europe and elsewhere.
This year, Canada is chairing the G7 and I'm chairing the Roundtable of G7 Data Protection and Privacy Authorities. However, I don't have the power to issue orders. So it's a gap that needs to be filled, because until it is, you have to go to court and spend time and money on litigation. However, it's simple to fix.
As you also recommended, we should consider the possibility of imposing fines. We hear about class action lawsuits involving large amounts of money, and that's what gets the attention of management and encourages compliance.
Yes. I could issue orders, but I couldn't impose fines. Those recommendations would be made to a new tribunal, which would then decide whether or not to impose them.
It seems paradoxical to me that we're proposing a new law on privacy when precedents already exist and already work in Canadian provinces and territories or in Europe, for example.
According to a recent report by the Canadian Security Intelligence Service, the Communist Party of China can access the personal data of TikTok users.
How could we better protect citizens in this regard?
As part of the study on Bill C‑11 which preceded Bill C‑27, we made recommendations, which we reiterated in the report on Bill C‑27 provided to the committee.
One of them was to include in the Personal Information Protection and Electronic Documents Act more specific rules on transfers of personal information outside the country.
At the moment, the act is quite general on the issue. It states that we must, by contract or otherwise, provide protection equivalent to that provided by Canada.
However, other countries, in Europe for example, have more rigorous protection regimes based on adequacy. Those countries assess the other country's legal system and determine whether privacy is sufficiently protected. There may also be model clauses, among other things.
That said, the regime could be stricter, which would lead to greater protection.
We've made some decisions with respect to the use of a supplier's services in a third country. The Privacy Act doesn't prohibit that. In Canada, data can be transferred outside the country. The regime basically says you have to provide protection equivalent to what the information would receive in Canada.
When we look at such cases, we check that consent is given for the same purposes as those for which it's used in the other country. We also check whether the terms of use are transparent, which is to say, that users who give their consent are fully informed of the purposes for which their information may be used. In some cases, we found that acceptable.
Yes. If there's a concern about a lack of transparency and the use of personal information for reasons other than what was believed, then we may be called upon to respond to a complaint. We could also launch our own investigation, but generally speaking, we respond to a complaint.
Certainly, there have been a lot of really interesting and probing questions.
You mentioned in your opening remarks that the decision by the government was made pursuant to the Investment Canada Act, which allows for a review of foreign investments that may cause injury to Canada's national security. You went on to say that the guidelines on the national security review of investments include a number of factors, such as “whether a foreign investment could facilitate access to sensitive personal data, including personally identifiable health or genetic information; biometric information; financial information; private communications; geolocation; or personal data concerning government officials.” You also went on to make the distinction that your investigation is focused primarily on “younger users”.
If the government makes such an alarming declaration based on those guidelines, going so far as to actually ban a company from this country, why haven't you also undertaken to look at some of the issues that may be related to the review of foreign investments that might injure Canada's national security?
We're focused on the application of Canadian privacy law and, in this case, we're focusing in particular on the protection of children. This is one of our priorities. It's a priority that's shared with my colleagues in the provinces. We've issued statements on this, so we're moving forward on it.
The national security aspect is a different matter, and there can be some overlap in some respects, but that one is being taken on by the government. We saw that as well when the government took the decision to ban TikTok from the devices used by government employees, for instance. That decision was made and announced a couple of days after the launch of my investigation.
There are different tracks, and they're moving forward on them. Again, if there are some aspects in our investigation that touch upon foreign access or otherwise, then we could—
Let me put the question another way. Let's say that the government hadn't made this decision. Would it not at least warrant an exploration? These are very serious accusations, and the guidelines seem to have what I would describe—and maybe you can confirm whether you believe this to be the case as well—as a form of corporate espionage. I refer, for example, to backdooring sensitive information, not just of young users but of corporate CEOs, researchers, people in academia, government officials dealing with regulatory issues as well as politicians.
Would you agree that the guidelines for the national security review seem to point to the notion of a type of technological espionage?
I don't want to interpret the guidelines that are used and interpreted by the government. What I can say is that our investigation is focused on consent, on the appropriate purposes and the privacy practices of TikTok in terms of—and you're right—not just with regard to younger users, but all users. We are in the process of investigating this. We will be issuing our—
I'm going to ask you a couple of questions, because I want to make sure that we get to the heart of the matter here, so I'm going to ask them fairly quickly.
Have TikTok representatives, either through the Canadian subsidiary or the head office, co-operated with your investigation?
I wouldn't be able to say yes to that. No, I think we're moving along with our investigation at this stage, and we'll be concluding it as soon as we can.
We are moving along. I'm not calling for more powers in terms of enforcement during the investigation. I'm calling for more powers at the end of the investigation.
I'm calling for order-making powers. If I find a breach of the law, it would be to order an organization to comply with my findings. I'm not calling for powers of investigation; we have those powers already in the law.
We have sufficient powers in the law to do our investigations, but once the investigation is completed, if I find that the law was not complied with, I don't have the authority to order the organization, for instance, to stop doing something to change its privacy practice. That's the challenge.
Commissioner, you had indicated that your office learned about the government's decision to order the closing of TikTok's Canadian subsidiary, TikTok Technology Canada, at the same time the public found out about that. Is that correct?
We have a rather interesting situation insofar as the government has proceeded to shut down TikTok's subsidiary. They have withheld from Canadians the rationale for doing so, citing national security risks, which they claim are so severe that they can't be shared with the public and, evidently, can't be shared with your office.
Has your office attempted to acquire further information as to the rationale for this that might be pertinent and informative to your investigation?
Going back to my point, we have the government saying, “We're shutting the office down. We have national security concerns. These concerns are so serious we can't inform the public.” It sounds as if the government isn't that interested in sharing anything with your office, but at the same time, it's “so serious” that Canadians can continue to use TikTok. I would submit that it doesn't add up. It doesn't make sense.
I guess when you look at the concerns the government has that have been identified, I would submit there are legitimate concerns about the fact that TikTok, being owned by ByteDance, a Chinese company, could theoretically be required to turn over data and other personal information of Canadians to the Beijing-based regime.
In that regard, I would note that article 77 of China's cybersecurity law ensures that data is collected and stored in China and that, when Beijing's Ministry of Public Security so orders, the data must be handed over, period, and so that theoretical concern is there.
Do you have any evidence that it is, in fact, happening? TikTok came before this committee and was absolutely adamant that such information has not been shared with the Beijing regime.
Again, this is an area where I can't venture because we are currently investigating. If it comes up, this is something we would indicate in our final report.
Shutting down the subsidiary of TikTok doesn't change the fact that Canadians are vulnerable to such information being shared with the Beijing-based regime, does it?
Yes. As you noted, Canadian law still applies where there is a real and substantial connection, but as you also said, your enforcement powers and the ability of Canadian courts to enforce matters and decisions don't exist, for all intents and purposes, in China, which is controlled by the Beijing-based communist regime.
—with that preamble is simply that this hasn't changed a thing from the standpoint of protecting the privacy of Canadians, but what it has done is muddy the waters in a considerable way.
The government has been completely lacking in transparency, and it's completely unacceptable.
I can help my colleague clarify the situation. The decision to terminate is a business decision. That's clear. That's why it comes from the Department of Innovation, Science and Economic Development, not from Mr. Dufresne's office. I'd like to thank him for clarifying that earlier.
However, a number of things can happen at the same time in the world of social media. Frankly, for people who still have doubts, I have a tip: TikTok is a Chinese app, so if you have doubts, don't download it. It's quite simple.
I'd like to talk about an app that's much more common and popular in our area, Facebook or Meta. I have my own accounts. My children tell me that they're not very interesting, and so much the better. To me, it's a way of posting personal photos and communicating with the public.
Mr. Dufresne, a very interesting ruling has just been handed down concerning Meta, and I believe that your office was involved in it.
Can you tell us about the settlement that was announced and whether you're satisfied with it? Does this show that your office has powers?
However, on our side, we investigated the Cambridge Analytica scandal, which affected Facebook. Because we don't have the power to issue an order, we had to take the matter to the Federal Court, and we won in the Federal Court of Appeal this fall. That was an important victory. The Federal Court of Appeal recognized that, in the Cambridge Analytica case, there had been a shortcoming in the way Facebook obtained consent and protected user information. Facebook is now seeking leave to appeal that decision to the Supreme Court.
It's still reassuring. Not so long ago, five or ten years ago, we seemed powerless before web giants like Facebook. Now we know of two cases where your office or users were able to take action against that company, and I'm sure there will be more.
Can you tell us a little bit more about how that went? As for TikTok, I understand that this is still to come.
What I would say about the cases involving Facebook is that it shows that the regime can apply and decisions can be obtained, but that it still takes time. The Cambridge Analytica case, for example, goes back to 2018. It would take less time if I had the authority to issue orders. That's why we recommended this and your committee recommended the same thing. That's the challenge I still see, because technology and websites are changing very quickly. Ideally, we need to be able to make decisions and execute them more quickly.
I agree, because we know that web giants have already resorted to tactics such as intimidation, blackmail and threats. That's disappointing. That reminds me of other times when we depended on the railways or other forms of communication. It's important for the government to take action to protect people.
Do you think the current legislation is adequate? I know you have very little time.
We recommended it in the case of Bill C‑27, and the parliamentary committee accepted that recommendation. However, it's important that this be done in both acts.
For our part, we verify whether the use of an application is consistent with the law. If it's not, we'd like to see it amended and fixed. In extreme cases, the solution may be to completely ban a use. In other cases, it might be enough to change the practice, increase protection or clarify consent. That's what we're looking at.
Last December, ByteDance admitted that it had been spying on U.S. journalists to identify their sources. In response to that, TikTok said they'd been working on better structures.
Every platform has its own practices. We check what's going on there, and we get complaints.
I announced today that my office had concerns about LinkedIn and the training of its artificial intelligence models. We contacted the company, and they responded well. They put in place a moratorium for the duration of our discussions. That's another example.
The solution isn't always a full investigation. Sometimes that can be done through education and exchange. We try to use all the tools we have.
We have a complaint with Meta, with Facebook and with this one. We had the announcement of our decision on Aylo and MindGeek. Those were the big ones that were concluded and made public.
In your estimation, do you see commonalities in the themes and the way in which privacy breaches are prevalent in what is otherwise a form of surveillance capitalism?
One of the things we've seen this summer, and we've done this with international partners and partners in Canada, is what we call a "privacy sweep". This year, we did it not only with privacy authorities but also with competition authorities. We looked at what we call "deceptive design practices", which are practices where an organization is going to use tools to manipulate users into making choices that are not in their interest.
Sadly, what we found was that 97% of the organizations and sites that we looked at—and we looked at a lot of them—had at least one of those bad practices, in terms of having language that's not clear—
Getting back to comparators—I want to focus on that for a moment—obviously we have lots of platforms, with all of them having very similar business models in how they advertise, profile and understand the end-user. Yet, TikTok was singled out and banned from this country without, I would argue, a real full public disclosure.
Do you believe that it's important for the public to fully understand the impacts on their privacy by their use of these platforms? Additionally, given the fact that TikTok has been banned, do you think the government has a duty to report back to the public as to why?
I think transparency is important. The more the public can understand the decisions of the government and the decisions of my office, the better. There may be some limits in terms of confidentiality, but certainly this is important.
We initiated this investigation vis-à-vis TikTok in particular because of the large number of younger users on the platform. That is a strategic priority that we have. We're moving forward on that and hoping that our conclusions on that will be beneficial to others who may have similar practices.
I want to thank our witnesses for being here again. I'm a relative newcomer to this committee. It feels like this is a bit of a regular occurrence.
Commissioner, I want to pick up from where Mr. Green left off. You said that younger users are targeted, which I found interesting. It seems to me that social media generally targets young users, period. I think we can agree on that, right?
That's right. Those were my words. You said it was a higher number.
You have several issues with privacy. You have issues with privacy that are contemporaneous with use. Then you have issues with privacy that exist regardless of whether someone uses this—tombstone information and that type of thing.
Obviously, younger users become older users. Would the concerns not be the same across social media platforms that would apply despite the fact that TikTok is targeting younger users?
We've made the protection of children's privacy one of our priorities. That's something that's shared internationally. We did it recently with the G7 counterparts. We issued a statement about building AI with the best interest of the child at heart. There will be some different considerations. We called for stronger privacy protections for younger users—for instance, for stronger rights to the deletion of information that might have been posted as a child. That's what I meant when I said that children have the right to be children. If you hold them to the same level of accountability as adults, because you're leaving things forever, that's a challenge.
We're looking at that. We're looking at making sure that the lens of children's best interest is always there and that those privacy protections are stronger by default.
Okay. It seems to me that this would be part of the transparency that I would hope we see. I didn't think that was a question in dispute. Obviously, it may well be, so I'll pivot to the issue of transparency.
My colleague Mr. Cooper spoke about the failings of the government. As I understand it, there are two parallel issues. You have the government winding up the TikTok business entity in Canada, and then you have a related issue with your investigation. As I understand what you said to Mr. Cooper, you are not privy to the Government of Canada's rationale, I guess, for winding up TikTok. Is that correct?
You said, if I understood you correctly, that you would be completing your investigation notwithstanding whatever the Government of Canada has decided. Is that right?
Yes. We're moving forward with our investigation. Again, whatever information we may need or obtain in the meantime, we would do that, and we would finalize the investigation. Then those findings and the full rationale will be made public at that time.
At the conclusion of your investigation, which you've done independently of the government's rationale, wouldn't it make sense to go to the government and determine what...and the reason why to further inform your decisions in order to determine whether further investigation is required? It's almost as though you're operating in a vacuum with maybe 70% of the information.
Right. Again, I think this would be part of the decisions we make in the context of that investigation. If we decide, well, we'll reach out to X to obtain this information and to inform our findings, that may occur, and then we would make that public at the end of the investigation, once it's concluded.
What I'm saying is that I'm not confirming the investigative steps that we've taken in the past or that we would take in the future. The investigation is ongoing. We want to conclude it as soon as possible and have those findings for Canadians.
I'm just trying to probe the depths of the investigation, especially as it relates to national security. It would seem that would be a necessary step. I'm not trying to challenge or suggest your investigation won't be fulsome. I'm saying that it seems to me that the national security element and the privacy element really would at least potentially go hand in hand.
I think Mr. Barrett might have talked about this a little bit. Realistically, under PIPEDA, for both Canadian and foreign companies, what can you do if they fail to provide the requested information that you need?
You talked about jurisdiction and then you talked about enforcement.
Realistically, what can you do for a Canadian company with a Canadian subsidiary and a foreign company without one?
I think the big difference is if we need to compel. It's if there's a lack of collaboration and we need a court order to order the disclosure of something. That can be easier if the entity is in Canada than if it's not. That's where this would come up.
Under the law, I have powers to order a company to give me documents or to provide access to some of its information. We can exercise those if a company is in Canada. If a company is not in Canada, then that can raise different questions.
Realistically speaking then, if TikTok were to close up shop in Canada and still do something that negatively impacts Canadians' privacy, then you do have the ability to enforce PIPEDA, but you'd need to do it through an MOU or through the courts.
Right. If PIPEDA applies to something because there's a real and substantial connection to Canada, then we can get an order from courts. The question becomes making that order recognized in another jurisdiction.
As you say, this could be done by MOU or by international recognition.
Brenda talked a little bit about the Facebook versus Canada case. You said it was a very important ruling where global tech giants “whose business models rely on users' data, must respect Canadian privacy law and protect individuals' fundamental right to privacy.”
How does this case affirm your oversight and jurisdiction over foreign entities?
How does it affirm your jurisdiction over tech giants that have deep pockets, have been willing to spend a ton of money and have historically acted bullishly both in Canada and globally?
Courts in Canada have recognized that because of the nature of data and international relations, it's not necessary for a company to be domiciled in Canada for Canadian courts to have jurisdiction. That's the real and substantial connection test. Courts have recognized this. If Canadian users are impacted, Canadian institutions will have jurisdiction on it.
That's why, in a case like Facebook and in other situations, we are able to assert jurisdiction independent of whether an organization is in Canada or not, provided that Canadian users are impacted.
Can you talk a little bit about the Facebook case where they refused to say they did anything wrong, but paid $9 million to some Quebec educational entities? I think it was a class action suit.
That's a class action settlement. That is distinct, and we were not involved in this matter. I think what this highlights is that we need to have the ability in privacy law for my office to be able to issue orders and to issue fines, because we see the impact that these financial amounts can have. Ideally, they won't be imposed, because organizations are going to do the right thing, but it helps decision-making.
What we have now under Canadian law with our Facebook case is that we have to seek an order from the court and to push in that direction. So far we have not obtained financial compensation in that case, although I will continue to push for that.
This may not be a totally related question, but I'm wondering how, if they refuse to admit any wrongdoing, and they take their deep bank accounts and throw a little bit of money at something to make it go away—and it does end up going away—that gets us to a point, moving forward, where we can get them to act less—and I'll use the word—“bullishly”. How do we get to that place where we actually...because I think until we have a court case where we truly have someone say, okay, we screwed up, we're wrong...?
There are a number of tools. We can do that with legislation. If you amend legislation as a parliament and you provide certain specific requirements, then that has an impact on organizations. You have an impact with international dialogue and collaboration. We're working very closely with G7 colleagues, with international colleagues, and are making international statements on AI and good practices. One of the reasons I'm advocating for order-making power and fines is that then you have a court order that says, “Here's what you need to do”, you have fines, which focus the mind, and then you can use the promotional work as well to really build that culture of privacy.
That's it. Thank you, Mr. Fisher. I did start your clock a little bit late, so you had the six-minute round plus on that one.
That concludes our panel.
[Translation]
Thank you once again, Mr. Dufresne and Mr. Chénier, for being here today.
[English]
Mr. Dufresne, I want to thank you for your professionalism as well. This was a fascinating discussion.
Just to let committee members know, on Thursday we'll have CSIS here as part of our study. They'll be here in the second hour. H&R Block is coming in the first hour for the CRA study.