:
Good afternoon, everyone.
I'm going to call the meeting to order.
[Translation]
Welcome to meeting No. 94 of the House of Commons Standing Committee on Access to Information, Privacy and Ethics.
[English]
Pursuant to Standing Order 108(3)(h) and the motion adopted by the committee on Tuesday, January 31, 2023, the committee is resuming its study of the use of social media platforms for data harvesting and unethical or illicit sharing of personal information with foreign entities.
[Translation]
Today's meeting is taking place in hybrid format, pursuant to the Standing Orders. Members are attending in person in the room and remotely using the Zoom application.
[English]
I just want to remind all members today that care must be taken with regard to the earpieces for interpretation. Please be mindful to not place your earpiece near the microphone, as this can result in feedback for the interpreters and may cause acoustic shock, which could in turn cause injury to our interpreters.
We have a witness in the first hour on Zoom. I will remind the committee that the witness has been tested and has the appropriate headset.
I'd now like to welcome our first witness today. We have, as an individual, Dr. Anatoliy Gruzd, professor and Canada research chair in privacy-preserving digital technologies at Toronto Metropolitan University.
Dr. Gruzd, you have up to five minutes for your opening statement.
Welcome, sir. Go ahead, please.
:
Thank you, Mr. Chair and committee members, for this opportunity to discuss the potential threat of foreign interference and the risks associated with the misuse of social media data.
I'm Anatoliy Gruzd, a Canada research chair and professor at Toronto Metropolitan University. I'm also a co-director of the social media lab, where I study social media's impact on society, information privacy and the spread of misinformation around conflicts such as the Russia-Ukraine war.
While my comments today are my own, they are grounded in research conducted at the social media lab and are informed by 15 years of working with various types of social media data.
As previous witnesses have testified, there are concerns that TikTok could be vulnerable to foreign interference, leading to major implications for our national security and individual privacy. However, I would like to point out that a loaded gun is different from a smoking gun. Despite its being framed as a national security threat, to date, there's still no public evidence that the Chinese government has spied on Canadians using a back door, or privileged access, to the TikTok app.
That is not to say there is nothing to worry about. There are valid concerns regarding the potential for TikTok and other platforms to be exploited by malicious actors for propaganda and radicalization. For example, Osama bin Laden's 2002 “Letter to America” recently resurfaced on TikTok and was seen by millions. However, these concerns are not limited to any one platform. Rather, they represent broader challenges to the integrity and security of our information environment.
As such, we must take a comprehensive approach to addressing these issues by compelling platforms to commit to the following: adopting the principles of privacy by design and by default, investing in expanding their trust and safety teams, and sharing data with researchers and journalists.
I'll expand on each of these points.
Teaching digital literacy is important, but it's unfair to place all the responsibility on individuals. Social media platforms are complex, and the algorithms that decide what users see and don't see remain black boxes. The only true choice we have is to disconnect from social media entirely, but that's not realistic or practical, as our own research has shown, because most Canadians have at least one social media account.
It's important to shift the focus from individual responsibility to developing strategies that compel companies to implement privacy by design and by default. Currently, it's all too common for platforms to collect more data by default than necessary.
However, even with privacy protection settings enabled, Canadians may still be vulnerable to malicious and state actors. According to a national survey that our lab released last year, half of Canadians reported encountering pro-Kremlin narratives on social media. This highlights concerns about the reach of foreign propaganda and disinformation in Canada, extending beyond a single platform.
In another example, earlier this year, Meta reported a sophisticated influence operation from China that spanned multiple platforms, including Facebook, Twitter, Telegram and YouTube. The operation tried to impersonate EU and U.S. companies, public figures and institutions, posting content that would match their identity before shifting to negative comments about Uyghur activists and critics of China.
To fight disinformation, platforms should expand their trust and safety teams, partner with fact-checking organizations and provide access to credible news content. Unfortunately, some platforms, like Meta and X, are doing the exact opposite.
To evaluate how well platforms are combatting disinformation, Canada should create an EU-style code of practice on disinformation and a transparency repository that would require large platforms to report regularly on their trust and safety activities in Canada.
To further increase transparency and oversight, Canada should mandate data access for researchers and journalists, which is essential to independently detect harmful trends. In the EU, this is achieved through the new Digital Services Act.
Currently, TikTok doesn’t provide data access to Canadian researchers, but it does so for those who reside in the U.S. and EU. Sadly, TikTok is not alone in this regard. Recently, X shut down its free data access for researchers.
In summary, while it's important to acknowledge the impact of foreign interference on social media, banning a single app may not be effective. It could also undermine trust in government, legitimize censorship and create an environment for misinformation to thrive.
A more nuanced approach should consider the various forms of information and develop strategies to address them directly, whether on TikTok or other platforms. This may involve a wider adoption of privacy by design and by default, expanding trust and safety teams in Canada and compelling platforms to share data with researchers and journalists for greater transparency and independent audit.
Thank you.
:
Thanks. I hate to cut you off, but we have limited time here.
It's interesting that you would bring that up. I know that we and a number of other committees addressed foreign election interference. The use of social media was a key part of that. Certainly, if you have further comments, I would invite you to send them to the committee.
I want to go to a bit of a grey area. We had TikTok before this committee, and they said, oh, privacy is great; all they require is basic information, and their settings are set up for kids. I'm paraphrasing, obviously, but very few people read the entirety of terms and conditions. Very few people understand what information is explicitly being provided. Even fewer, I would suggest, understand how impactful the information they provide is, whether it be pictures of the front of their homes or themselves on holiday.
I'm wondering if you could provide guidance to this committee, in the minute you have left, on how to balance freedom of expression, the advancement that's taken place in the social sphere, and ensuring that Canadians' privacy and safety is safeguarded.
:
Thank you very much, Chair.
Thank you, Mr. Gruzd, for coming in today. We really appreciate your time.
I'll start by continuing where Mr. Kurek was leading.
In the context of the Israel-Palestine war, we've seen Canadians, especially young people, being targeted for posting their views online, to the point where their employment and education are affected. Regardless of which side of the issue they're on, there is a kind of pile-on culture online, with individuals targeted for expressing their views.
Do you think that social media companies have a responsibility to provide protection and maintenance of freedom of expression, especially for young people online?
:
The reason I pause is that this goes hand in hand with the type of influencer content individuals are consuming on these platforms, content that can trigger or lead them to certain expressions.
One concern we've observed over the years when conducting surveys with Canadians is that more of us are turning to social media for information about conflicts like the war in Ukraine or the war in Palestine.
What if there are no credible news organizations providing that content? The reactions you see on social media platforms are quite often driven by the influencer content that provides the news.
When we asked TikTok users in Canada, half of them said they use the platform for news about the war between Russia and Ukraine. This is concerning, because when you go to the platform and search for trusted news sources, the most popular ones will be CTV, Global News and CBC, according to the digital trust rating. They have only around 150,000 to 160,000 followers each. They cannot compete with influencer content.
Freedom of speech is important, but it's just as important to make sure that when our citizens—Canadians—are participating in those platforms, they have access to credible information when they react to it online.
:
My point about not putting all the responsibility on individuals comes from several directions. First of all, even if individuals know how to change their privacy settings, many platforms will still have access to their private messages. While users feel they're protected, they're actually not.
Education is important, but it doesn't necessarily mean training individuals. It's hard to change individual behaviour, but platforms can incorporate tools that help users protect themselves more efficiently and effectively.
Here are a couple of simple examples. In many browsers now, when you mouse over a picture, there's a button you can use to search for related images. It's a simple tool that I'd be happy to train people on, but it's already an embedded part of the platform.
We haven't talked about generative AI, but that's the next stage of this evolutionary process. How do we make sure individual users have tools to detect what is real and what is authentic? Such tools are not yet part of these platforms. It could be through digital certification or through other means, but they should be part of the platforms.
The other quick point about education is that it's much more effective to institutionalize the training.
I'll give you another example. When I was preparing for this meeting, there was a test for Zoom, and the instructions told me to use incognito mode in the browser. Providing instructions is part of the process; it's part of the institution. It's much more systematic and effective.
:
The Social Media Lab produces a report every two years entitled “The State of Social Media in Canada”, in which we ask Canadians what platforms they use. Most of the top nine platforms are North American and U.S.-based; the exception is TikTok, which is also the fastest-growing platform. Around one-third of Canadians use it.
Another platform, which hasn't yet reached a 10% adoption rate in Canada, is Telegram. It is being adopted quite widely around the world. Interestingly, its grade on the service rating I mentioned is a B, which is quite high compared with the E given to the rest of the platforms. While it's privacy-friendly, or conscious of users, it's full of Kremlin propaganda discourse, so you pick your poison, unfortunately.
I would definitely keep an eye on Telegram and a lot of messaging types of apps.
I had a question earlier about WeChat and similar apps. Those are really hard to study. Anything this committee can do to help mandate platforms to share insights on those platforms and their public groups, where most of that content originates or is propagated, would be very helpful going forward.
:
The question is a bit broad, but I'll try to contextualize it.
When we're talking specifically about generative AI tools, the concern for me, from the data privacy perspective, would be Canadians going to websites like ChatGPT. They will tag their private and personal information into the window without realizing that they are actually consenting to that data being used for future training. They don't know whether that content will be printed out or spit out in somebody else's stream. I think that would be one form of concern.
The other concern, of course, is social media platforms relying on AI tools to detect harmful content, simply because of the scale of the problem. Earlier this year, I was looking at some of Meta's transparency report charts, which showed that around 65% of content classified as harassment and bullying was removed automatically. There's still a significant percentage, around 35%, that users had to report before the platform acted on it. From that perspective, AI is important for flagging problematic content, because platforms will never have enough human content moderators or fact-checkers to review it all.
When we look at AI, I think we have to differentiate the kind of use case we're actually talking about.
:
Some of the privacy legislation tools you're considering may be effective in terms of making sure Canadians can request that their data be removed from some of those services. That could be quite effective.
The other aspects I referred to earlier, in my opening remarks, are about creating a repository and code of conduct on disinformation in particular. Right now, this is happening and functioning in the EU. Very large online platforms there (defined as platforms with 45 million or more users) report, usually every six months or so, on their activities and on what they've done to stop foreign interference, country by country. We don't see any such statistics for Canada.
Related to your question about AI, when platforms take action on AI-driven content, I would like to see how much of that content.... What was the purpose?
I think that will inform our next steps.
Doctor, it's great to see you.
There have been some very great questions from all parliamentarians here.
I'm going to approach the next series of questions in a couple of ways.
First, what can average Canadians out there do to protect themselves from disinformation and misinformation? That's one.
However, you also brought up, on several occasions, initiatives around terms of service and what communities are doing in that regard. I'd like you to unpack that, along with the EU code of practice on disinformation.
Are there three things the Government of Canada can do to bring TikTok and other social media platforms to the table in order to ensure there's less misinformation and disinformation from an economic standpoint, domestically and internationally? I think MP Green highlighted that. He has done so very effectively on many occasions.
That would be the series of questions I have, and I can unpack those as you go.
:
We have individual education and what individual Canadians can do. We have to consider what age group we're discussing. Earlier, I heard in this committee that the focus is on the underage population, which is quite an important and vulnerable group. However, sometimes we overlook older adults and other age groups.
Frankly, education shouldn't stop, but we cannot prepare individuals for all cases. That's why I mentioned earlier that platforms should be compelled to incorporate tools that can signal whether something is potentially problematic. We had a great example during the COVID pandemic, when platforms stepped up and provided useful interventions—even simple things, such as adding a link to Health Canada when somebody talked about COVID, or flagging that some of the content in a post may not accurately reflect scientific knowledge. Those interventions are in fact helpful in reducing the spread of misinformation and disinformation. Unfortunately, lately we are seeing those initiatives being dropped completely, and the things we learned from them are not being applied to other domain areas.
If we are talking specifically about the education of younger adults or teenagers, we can't just think about traditional training. We can teach those skills, but we should also look at interesting interventions, such as games that essentially put users in the position of running an information operation. A number of interesting studies show the effectiveness of these interventions: when people have to run such a campaign themselves, they become more aware of the tactics that may be directed at them in their real-life interactions.
Can you please repeat the other aspects of the question?
:
Thank you very much, Mr. Chair.
Thank you for your testimony. I'm finding it very helpful.
We have a unique opportunity. We have the former president of the Treasury Board here at committee now. We know that the decision to ban TikTok was made by the chief information officer, whom we'll have before committee. We have heard in previous testimony from CSIS and from the Communications Security Establishment that they provided advice to the chief information officer. They wouldn't get into the details of that advice, but they provided it, and ultimately the decision, back on February 27, I believe, was to ban the app from government devices.
I would give you this opportunity, sir, and ask you this, with your subject matter expertise: If you were advising the chief information officer on this proposed ban of TikTok, what advice would you give, and what other areas or topics would you have covered?
:
I call the meeting back to order. Welcome back, everyone.
I'd like to now welcome our witnesses for the second part of our meeting today.
First of all, from the Royal Canadian Mounted Police, we have Deputy Commissioner Bryan Larkin, who is responsible for specialized policing services. Welcome, Deputy Commissioner.
We also have Brigitte Gauvin, who's the acting assistant commissioner of federal policing, national security.
Also with us today, from the Treasury Board Secretariat, is Catherine Luelo, deputy minister and chief information officer.
I understand, Deputy Commissioner Larkin, that you have an opening statement.
I'm just confirming, Ms. Luelo, that you do not have an opening statement. Is that correct?
[Translation]
Good afternoon, Mr. Chair.
[English]
Hello, honourable chair and members of the committee.
My name is Bryan Larkin. I am the deputy commissioner of specialized policing. I'm joined by Assistant Commissioner Brigitte Gauvin.
First, I would like to thank all of you for the opportunity to discuss the issue. The exploitation of the personal data of Canadians by foreign actors and the commission of crimes in the digital space are of the highest priority and among the key mandates of the Royal Canadian Mounted Police.
Foreign interference affects every aspect of our lives, from the foundations of our democracy and our economic prosperity to the critical infrastructure essential to our well-being and the fundamental rights and values that define us as a society. It is a multi-layered threat, with foreign actors seeking to advance their objectives in a myriad of ways, including through state-backed harassment and intimidation of individuals and communities across Canada.
Make no mistake: Foreign governments are leveraging data harvested through popular social media platforms to profile individuals and conduct misinformation and disinformation campaigns within Canada. Among other threat activities, online data is also being used to identify and repress political dissidents who seek refuge in Canada.
Foreign interference actors are also making nefarious linkages to criminal organizations, which facilitate the commission of and profit from illicit activities such as online fraud, cyber-espionage, child exploitation and intellectual property theft.
With these considerations in mind, today we will briefly cover the RCMP's role in contributing to the protection of all Canadians from foreign interference in the cyber realm.
As Canada's national police force, the RCMP is mandated to investigate criminal activity related to serious and organized crime and national security, which includes instances of foreign interference conducted through online means. Through our national cybercrime coordination centre, the RCMP works with all law enforcement and other partners, including the Canadian anti-fraud centre, to help reduce the threat, impact and victimization from cybercrime within Canada.
In 2022, of the more than 30,000 reports of cyber-enabled fraud and scams, about 35% had a nexus to social media platforms. We also work closely with police services across the country, as they are often the first law enforcement entities to learn about state-backed cybercriminal activities targeting Canadians.
While the RCMP is investigating cyber-threats and actors, Canadians also need to recognize the dangers and the impact of online activity. In particular, it's critical for all of us to understand that everything we share is collected and stored on servers. These are often located outside our national borders, where privacy rights may not have the same meaning as they do here. In essence, we leave a digital footprint that extends well beyond our borders.
In some foreign jurisdictions, national security laws oblige social media companies to share this personal data collected from international users with local governments. This data is then used to harass, coerce and/or threaten dissenting voices, political leadership and our diverse communities abroad, and/or to facilitate cybercriminal activities.
Youth are particularly vulnerable to cybercrime. They tend to trust the digital environment without fully grasping the risks associated with digital platforms. Their extensive use of social media, coupled with a tendency to overshare personal information, makes them particularly attractive targets for cybercriminals.
Our national youth services engage and educate young people about online safety through collaboration with school resource officers and various organizations. Additionally, the RCMP is committed to working with our diverse communities and newcomers to provide them with information, including safety tips on how to recognize fraudulent calls and phishing scams.
NC3, our national cybercrime coordination centre, and the Canadian anti-fraud centre are also engaged in the Government of Canada's “Get Cyber Safe” public awareness campaign, which aims to inform all Canadians, including youth, about cyber-threats and prevention.
The RCMP also produces operational bulletins and reporting tools for frontline police officers, strategic partners and the public, with the goal of increasing reporting on federal crimes and engaging with culturally diverse communities.
The protection of Canada and the safety of its citizens and residents are paramount to the RCMP. It will be important for all aspects of society to work together to protect against foreign interference in this space.
Thank you.
:
Thank you very much. I intend to take only a couple of minutes. I want to leave time for committee members to ask any questions they may have about this important topic.
Thank you for having me today virtually.
I think the deputy commissioner outlined a number of things very well. I will take just a minute to situate my role.
As the chief information officer of Canada, I am accountable for ensuring that we have clear rules and guidelines around the usage of Government of Canada devices. That's the purview through which I made the decision on TikTok.
When we're looking at deciding what acceptable use is on government devices, we balance a number of considerations, including privacy, what is acceptable use in business environments, and cost. All of these factors go into deciding what we allow on devices.
Maybe just as a last comment, it would be my best advice that we continue to tighten our environment in terms of the use of Government of Canada devices. We have a fairly open environment, in which about 90% of Government of Canada devices allow downloads of whatever the user would like.
We have partitioned devices, some for business and some for personal use on one common device. From my experience in the private sector, that's not usual, so it would again be my advice, and the direction in which I've been moving the organization, to further tighten that environment so that we balance out the use of devices for government business and government business alone. In doing so, we are going to have a knock-on effect that I think is going to better protect the privacy of our information.
I look forward to your questions, and I will pass it back to you to allow as much time as possible for that.
Thank you.
:
Thank you, Chair, and thank you to the witnesses for your attendance today.
With time permitting, I will circle back to the focus of this meeting, that being social media and foreign interference, but there is another pressing issue that Canadians want answers to.
Deputy Commissioner Larkin, all my questions will be directed towards you.
You'll agree that there are basic legal tenets under criminal law; namely, that ignorance of the law is no excuse and that no Canadian is above the law. That includes all members of Parliament and the himself.
Would you agree with that?
:
If Ms. Fortier or any member of the Liberal bench wishes to continue to raise a point of order on my questions before the question is even put to the witness, we are defeating the purpose for which we are here. I hear from Ms. Fortier that she wants to deal with questions surrounding social media and foreign interference. If she and her colleagues continue to interrupt me, there's going to be very little time for them to have the opportunity to deal with what they believe to be relevant questions.
I agree with you, Chair, that relevancy is a very subjective art. I stated it at the outset. With time permitting, I will be circling back to the content matter of this meeting, but I object to this constant interference by the Liberals.
Some hon. members: Oh, oh!
Mr. Larry Brock: They're laughing. Yes, you can laugh all you want, Carolyn Bennett, because it's not funny to Canadians.
:
Thank you very much, Mr. Chair.
Mr. Chair, you know that I have a lot of respect for you. I think you're doing a good job in the seat there. I'm going to share with you what my concern is with what's happening right now.
My concern is that as somebody who will often use procedure.... I know that my good friend Mr. Brock will have an appreciation of this, as we spent a lot of time together on DEDC. My concern is that when filibusters arise—and they do—you will know that I often will reflect on the ruling of relevance. If what we're doing now is setting a precedent that allows for any and all topics to be debated at any and all times, that's going to affect my future interventions on relevance when it comes to filibusters.
I know that Mr. Brock has a deep respect and consideration for procedural rules, and I would ask that we get back to the study at hand, so that in future debates when I call a point of relevance you won't reflect back on today and say that anything and everything is fair game.
:
I intend to. Thank you, Chair.
The Chair: You have the floor for one minute and 40 seconds. Go ahead.
Mr. Larry Brock: Again, Deputy Commissioner Larkin, hopefully, I can get this question out.
My social media is absolutely abuzz with concerns in regard to this particular area.
Some hon. members: Oh, oh!
Mr. Larry Brock: I can mention social media a thousand times to satisfy my Liberal colleagues, but social media is such that Canadians want to know: Is the RCMP impervious to the thought that the is incapable of being charged with a criminal offence?
I know this is a sensitive matter, but I asked you for a pointed response, and I'm not getting a pointed response.
If the RCMP had reasonable and probable grounds to believe that our , had been involved in a criminal offence, which is your legal threshold—reasonable and probable grounds—and you consulted with the appropriate legal authorities—you consulted with the Department of Justice; you consulted with provincial and territorial Crown attorneys—if they gave you the green light that the facts and the evidence were there and that your legal threshold was met, can you advise Canadians that in that hypothetical you could charge with a criminal offence? Yes or no, sir.
I think now you know why it's very important to do a study on misinformation and disinformation, both domestic and foreign.
A big worry for me.... I'm a father of a 15-year-old and a 12-year-old, so this is a generational risk, and we're trying to find out how we can alleviate some of the concerns we all have.
My question is for the Deputy Commissioner.
I will go to you first, please. With the abundance of information that people are receiving, what are common challenges faced by law enforcement in dealing with cybercrime on social media platforms?
:
I think one of the challenges we're seeing is the amplification of social media in criminal investigations. Generally speaking, in the majority of investigations we now touch, whether it be a low-threshold crime, a property crime, a violent crime or exploitation, there is some form of digital element tied to it.
The challenge for us as a national police service, and for our partners and police of jurisdiction, is that the fundamental investigation has changed from what it used to be: something that took place in a neighbourhood or a schoolyard.
What we're seeing, particularly in this instance, are foreign actors who are using and amplifying social media to target Canadian citizens and/or citizens from abroad who are living in our country. That presents a significant challenge. We don't monitor social media. We obviously use it as an investigative tool or capacity, but when you look at all the various social media platforms, the reality is that the amplification of social media in criminal investigations is impacting everything we do, every single day.
:
To clarify, we don't actively monitor social media; however, we do use it as part of our investigations through open-source information. We use software that refines our searches as a part of our criminal investigations or the work we do.
Through our national cybercrime coordination centre, we have ongoing relationships with all social media platforms. We have protocols in place, particularly around child exploitation and harm to young people. Those are all things that we do.
Obviously, we are working internally with the Government of Canada on online safety and future legislation, etc. However, again, the sheer nature of this is that we work with other police jurisdictions. Information is shared with us. We obviously use that to advance investigations. We follow lawful access, production orders and/or search warrants to obtain further information from social media platforms. We have ongoing protocols with their security departments to receive and retrieve that information.
When you look at every piece of social media that we identify and track and/or use as part of our investigations, it is evidence. It has also increased the demand within our organization. The demand on policing is fairly significant.
:
I'll take that question, Mr. Chair.
The national security program investigates criminal activities. We don't investigate social media, and we don't investigate whether there's misinformation, disinformation or influence. If the criminal activities pertain to foreign interference, we absolutely will investigate that under our mandate.
As part of our investigations, we can obtain information through social media subscriber information and other information we can obtain, either through open source or via judicial authorizations. For the national security program, the criminal activity has to pertain to foreign actor interference, for example.
:
There was something that kind of annoyed me. I'll share this with you.
I was on public accounts and government operations, and one of my first studies was an audit on our current state of technology. I remember asking the question, do we still run things on DOS? Is it that old? In some instances the answer was yes, it was actually that old.
Of course, at that time, we had a Liberal government, which said it was going to usher in this new age of openness and transparency. There was a minister for digital governance, and then, kind of unceremoniously, they just disappeared sometime in 2019.
Do you regret that? Do you think that if we had maybe kept that mandate and that whole-of-government approach—using Liberal jargon—there might have been some headway, some kind of fiscal intervention or investment by this government to get to where you wanted to go?
:
The meeting is back in session.
There's been a motion moved by Ms. Khalid. All members of the committee should have that motion. It's in relation to the current study, so it is an admissible motion.
Before we get into debate—and I see your hand, Mr. Kurek—Deputy Commissioner Larkin and Madame Gauvin, I'm going to release you at this point, if you don't mind. I want to say thank you for appearing before the committee today and providing valuable information.
Ms. Luelo, I'm going to release you as well.
It's not as easy for me. You have to click “cancel” or “leave meeting”, but I do appreciate all of you for being here today. Thank you.
:
Thanks very much, Chair.
I find it interesting. Reading this motion, I see a filibuster by any other name.... I've not, in my experience—now having been elected for a number of years and having spent a fair amount of time at this committee and others—seen a motion that says “as many meetings as possible to complete the witness testimony”.
Certainly, I understand there are some topics of discussion that this committee has undertaken that make the government uncomfortable, but I think it's a wide-reaching motion that basically says this may never end. I think it is concerning and certainly indicative of an ulterior motive.
That's not to diminish the importance of the subject at hand, but I think, Chair, we've had the conversation before that we can't walk and chew gum at the same time.
I would, just if I could.... I don't want to give up my time, Chair, but I know this committee has spent some time working out a work plan. I understand that the next two meetings are particularly seized with this, so in terms of information for the committee, I believe it would be relevant...and then I have some further comments, so I'll certainly continue on that.
However, Chair, I'm wondering if you could direct the clerk or the analysts to share the specifics of what the work plan includes, specifically related to the next two meetings, and then I'll have a couple of further comments.
:
I appreciate the question, Mr. Kurek. You'll still have the floor when we return.
We have witnesses for the 29th. We've been working on witnesses for December 4. We actually have the notice of meeting ready to go out. They would be Mr. Caraway, who, unfortunately, had some technical issues; Dr. Emily Laidlaw, associate professor and Canada research chair in cybersecurity law from the University of Calgary; Mr. Matt Malone is scheduled to appear; and from the Dais, we have Sam Andrey, managing director, and Joe Masoodi, senior policy analyst.
I'm going to refer to the clerk to speak specifically about the meeting on December 4.
Do we have witnesses at this point for December 4, Madam Clerk, or are we still waiting to hear?
It's very telling, I think—and this committee is seized with what is an important discussion surrounding social media and its impact on Canada and Canadian young people—that passing the motion as it's written, Mr. Chair, would effectively override the work plan that has, in its future, the commissioner of the RCMP coming to appear regarding SNC-Lavalin. I think it's pretty clear that there is an ulterior motive.
As well, I would note that, with the witnesses to appear, they have Meta—Facebook. There's a vice-president for Google and a managing director of Twitter Canada. However, they then ask for the CEO of TikTok. For consistency, I think it would also be very reasonable to ask executive members of those organizations to come when having this discussion.
That point aside, Mr. Chair, I would move an amendment to Ms. Khalid's motion. It would simply be that (c) be deleted from the motion.
:
Chair, that we dedicate as many meetings as possible to complete the witness testimony.... “As many meetings as possible” is, I guess, limited only by the resources of the House. It's not as many meetings as necessary.
I don't think having meetings for the sake of having meetings is going to accomplish what we need to do. Also, making sure that we actually have business to populate “as many meetings as possible” would be important.
If we're sending a summons to folks who have already declined.... We found ourselves in this situation at this committee before, which netted the same result with folks not headquartered in Canada. We saw that with the CEO from Meta before. I would expect that should the CEO of TikTok, the parent company, not be in Canada, we would receive a similar response.
Not having limitless meetings is important. We have meetings that are programmed already, and we have....
I'm sorry, Chair—
A voice: [Inaudible—Editor]
:
I'll call the question.
Mr. Michael Barrett: Can we get a recorded vote, please?
The Chair: We'll have a recorded vote on the amendment.
We're going to have to do this real quick, Madam Clerk. Go ahead.
(Amendment negatived: nays 7; yeas 3 [See Minutes of Proceedings])
The Chair: The amendment was defeated.
I can't do this. I can't take any more.... We have competing committees here tonight. We have finance and OGGO that are dealing with this. The clerk has advised me that I can't....
We're on the main motion. I can't do this, so I'm going to have to—