
Standing Committee on Canadian Heritage


NUMBER 124 | 1st SESSION | 44th PARLIAMENT

EVIDENCE

Tuesday, June 11, 2024

[Recorded by Electronic Apparatus]

(1655)

[English]

     I'm calling the meeting to order, please.

[Translation]

    Welcome to meeting number 124 of the House of Commons Standing Committee on Canadian Heritage.
    I would like to acknowledge that this meeting is taking place on the unceded traditional territory of the Algonquin Anishinabe peoples.

[English]

    Pursuant to Standing Order 108(2) and the motion adopted by the committee on February 14, 2022, the committee is resuming its study of online harms.
    Before we begin, I want to do the usual housekeeping, mostly for the benefit of the visitors.
     Please take note of the following preventative measures in place to protect the health and safety of all participants, including the interpreters.
    Only an approved black earpiece may be used. The former grey earpieces must no longer be used. Keep your earpiece away from all microphones at all times. When you're not using your earpiece, please place it face down on the decal in front of you. Thanks for your co-operation.
    You're not allowed to take pictures of the screen or of what is going on here. The proceedings will be posted publicly later on.
    In accordance with the committee's routine motion, I think our clerk has already made sure the witnesses have completed the required connection tests in advance of the meeting. Thank you.
    I want to make a few comments for the benefit of members and witnesses.
     Please wait until I recognize you by name before speaking. If you're in the room, raise your hand if you wish to speak. If you're appearing virtually, raise your virtual hand. Thank you very much. All comments should be addressed through the chair.
    Pursuant to the motion adopted by the committee on Tuesday, April 9, we have Claude Barraud, a psychotherapist from Homewood Health, in the room with us today. During the meeting, should you feel distressed or uncomfortable due to the sensitive nature of the committee's study, you can speak with Mr. Barraud, who is available to assist.
    I now want to welcome our witnesses.
    Joining us by video conference are Vivek Krishnamurthy, associate professor of law, University of Colorado law school; Emily Laidlaw, associate professor and Canada research chair in cybersecurity law, University of Calgary; Carol Todd, founder and mother, Amanda Todd Legacy Society; and Dianne Lalonde, research and knowledge mobilization specialist, Centre for Research and Education on Violence Against Women and Children.
    In the room, we have, from Connecting to Protect, Dr. Jocelyn Monsma Selby, chair, clinical therapist and researcher specializing in forensic sexology and addiction; and from the Sûreté du Québec, Marc-Antoine Vachon, lieutenant.
    You each have five minutes....
    Monsieur Champoux, please go ahead.
(1700)

[Translation]

    Madam Chair, I would like the members of the committee to agree on one thing before we start the meeting.
    The bells are expected to ring in the next few minutes for us to go vote. So I would like us to make sure that the witnesses will be able to finish their opening remarks, even if the bells are ringing. If there is time for one round of questions, so much the better.
    Out of respect for the witnesses who are testifying as part of our study, we must give them as much time as possible.

[English]

     Thank you, Monsieur Champoux.
     Is everyone in agreement? It's a reasonable request.
     Mrs. Thomas, please go ahead.
    Just to further define that, I wonder if there would be agreement around the table to allow remarks and questions to continue until, let's say, 10 minutes before the vote or five minutes before the vote, if we're willing to vote virtually.
     That's traditional, yes.
     Does anyone disagree with that?
     I'm not disagreeing, but if everyone's voting virtually from here, why can't we all vote and then get back to business?
    We can't according to the rules. Is that right?
    We can't according to the rules. We have to wait until the vote is counted.
    Then I'll just follow what was said to maximize the time.
    Yes. We have to wait until the vote is counted, Michael.
    Go ahead, Taleeb.
    Just to clarify, what Mrs. Thomas is saying, and I think I would be okay with this, is that, if it's a 30-minute bell, we do 20 minutes. People can go and vote and do what they need to do. Then we'd do as custom once the vote is counted.
    There is an option, of course, for you not to go anywhere. I can suspend and you can vote here.
    She's suggesting five or 10 minutes. Do you want five or 10, people? Speak now or I will make a decision, and you may not like it.
    Mrs. Rachael Thomas: I say five.
    The Chair: Five is good. I think five is good, since we're voting virtually.
    An hon. member: [Inaudible—Editor]
    The Chair: Because we cannot go back into business until—
    Some hon. members: Oh, oh!
    The Chair: Guys, can I have one person speaking at a time?
    We cannot go back into business until the votes are counted and read. We could cut short at the beginning. That's fine.
    Okay. That's how we're going to do it.
    Now, I've given the names of all those who will be presenting today. We've put them all into one presentation group as opposed to two separate groups. Again, that's for time.
    I want to apologize to the witnesses. Votes tend to do this, and it disrupts committee a lot. Quite often, people who have come and are waiting to present find themselves unable to do so.
    We will have your presentations now. You each have five minutes to present.
    Mr. Krishnamurthy, we will begin with you. You have five minutes, please.
     I'm very honoured to be here. I apologize in advance that I also have a hard deadline, due to child care obligations, so let me get right to it.
    I'm not an expert on the harms caused by what the committee is studying, that is, exposure to illegal explicit sexual content. The focus of my remarks today will be on the technological means by which this kind of content is distributed and what can be done about it in compliance with the charter.
    Just to frame my remarks, I think we can distinguish between two kinds of material. There's certain material that's per se illegal. Child sexual exploitation material is always illegal, but we face a challenge with material that's what I would call “conditionally illegal”. I think non-consensual distribution of intimate imagery falls into this category, because the illegality depends on whether the distribution is consensual or not—or the creation, for that matter.
     The challenge we face is in regulating the distribution of this content through means of distribution that are general purpose. Take a social media platform, whichever one you want—Instagram, TikTok—or take a messaging platform such as WhatsApp. The problem with regulating the distribution of this content on those platforms is, of course, that we use them for many positive purposes, but they can, of course, be used for ill as well.
    I'd like to pivot briefly to discuss the online harms act, which is, of course, before Parliament right now and which I think offers a good approach to dealing with one part of the distribution challenge with regard to social media platforms. These are platforms that take content generated by individuals and make it available to a large number of people. I think the framework of this law is quite sensible in that it creates “a duty to act responsibly”, which gets to the systemic problem of how platforms curate and moderate content. The idea here is to reduce the risk that this kind of content gets distributed on these platforms.
     The bill is, in my view, well designed, in that there's also a duty to remove content, especially child sexual exploitation material and non-consensual distribution of intimate imagery, to the extent that platforms' own moderation efforts or user reports flag that content as being unlawful. This is a very sensible approach that I think is very compliant with the charter in its broad strokes.
    The challenge, however, is with the effectiveness of these laws. It's very hard to determine before the fact how effective they are, because of issues with determining both the numerator and the denominator. I don't want to take us too far into mathematical territory, but it's very hard for us to measure the prevalence of this content online or on any given platform—that's the denominator. It's just hard to identify, in part because the legality—or not—of the content depends on the conditions in which it's distributed. Then, on the numerator, which is how well the platforms are doing the job of getting it off, again, we have issues with identifying what's in and what's out. This is a step forward, but the bill has limitations.
     One way of understanding the limitations is with an analogy that a friend of mine, Peter Swire, who teaches at Georgia Tech, calls the problem of “elephants and mice”. There are some elephants in the room, which are large, powerful and visible actors. These are your Metas and your TikToks, or even a company like Pornhub, which has a very large and significant presence. These are players that can't hide from the law, but what is difficult in this space is that there are many mice. Mice are small, they're furtive and they reproduce very quickly. They move around in darkness. This law is going to be very difficult to implement with regard to those kinds of actors, the ones that we find on the darker corners of the Internet.
    Again, I think Bill C-63 is a very—
(1705)
    I'm sorry to interrupt. You have 26 seconds to wrap up.
    Very well.
     The only thing I will say to conclude is that Bill C-63 does not deal with messaging software, with things like WhatsApp, which are a primary vector by which this kind of content moves. I think that is a good call, because of the difficulty in doing so. It's something that requires further study, a lot of work and a lot of thought on dealing with that particular piece of the distribution problem.
    Thank you, Madam Chair.
     Thank you very much.
    I'm going to the next person, who is Ms. Emily Laidlaw. Before you begin, I'll let you know that I will give you a 30-second warning, so that you can start to wrap up. If you miss some of your presentation, you can elaborate during the question-and-answer period. Thank you.
    It's over to Ms. Laidlaw for five minutes, please.
(1710)
    With my time, I'm going to focus on social media regulation and on Bills C-63 and S-210.
    Social media has historically been lightly regulated. Online safety has only been addressed if companies felt like it or they were pressured by the market. There have been some innovative solutions, and we need them to continue to innovate, but safety has generally taken a back seat to other interests.
    Social media companies have also privately set rules for freedom of expression, privacy and children's rights. There are no minimum standards and no ways to hold companies accountable. That is changing globally. Many jurisdictions have passed online harms legislation. The online harms act, which is part of Bill C-63, aligns with global approaches. In my view, with tweaks, Bill C-63 is the number one avenue to address illegal sexually explicit content and sexual exploitation.
    Bill S-210 would mandate age verification to access sites with sexually explicit material. It is a flawed bill, yes, but more importantly, it is unnecessary for two reasons.
     First, age verification is the crucial next frontier of online safety, but it is about more than sexually explicit material; it is about child safety broadly. The technology is evolving, and if we are committed to freedom of expression, privacy and cybersecurity, how this technology is used must be scrutinized closely.
     Second, age verification is only one tool in the tool box. A holistic approach is needed whereby safety is considered in product design, content moderation systems and algorithms. Let me give you a few examples of safety by design that do not involve age verification.
     Child luring and sextortion rates are rising. What steps could social media take? Flag unusual friend requests from strangers and people in distant locations. Remove network expansion prompts whereby friends are recommended based on location and interest. Provide easy-to-use complaints mechanisms. Provide user empowerment tools, like blocking accounts.
     The non-consensual disclosure of intimate images and child sexual abuse material requires immediate action. Does the social media service offer quick takedown mechanisms? Does it follow through with them? Does it flag synthetic media like deepfakes? How usable are the complaints mechanisms?
    For example, Discord has been used to livestream child sexual exploitation content. The Australian e-safety commissioner reported that Discord does not enable in-service reporting of livestreamed abuse. This is an easy fix.
    The last example is that the Canadian Centre for Child Protection offers a tool to industry, called Project Arachnid, to proactively detect child sexual abuse material. Should social media companies be using this to detect and remove content?
    In my view, Bill C-63, again with tweaks, is the best avenue to address sexual exploitation generally. I think the focus should be on how to improve that bill. There are many reasons for that. I'll give two here.
    First, the bill imposes three different types of responsibility. Vivek discussed this. Notably, the strongest obligation is the power of the commissioner to order the removal of child sexual abuse content and non-consensual disclosure of intimate images. This recognizes the need for the swift removal of the worst kinds of content.
    Second, all of this would be overseen by a digital safety commission, ombudsperson and office. Courts are never going to be fast to resolve the kinds of disputes here, and they're costly. The power of the commissioner to order the removal of the worst forms of content is crucial to providing access to justice.
     Courts are just ill-suited to oversee safety by design as well, which is necessarily an iterative process between the commission and companies. The tech evolves, and so do the harm and the solutions.
    With my remaining time, I want to flag one challenge before I close, which Vivek mentioned as well. That is private messaging. Bill C-63 does not tackle private messaging. This is a logical decision; otherwise, it opens a can of worms.
    Many of the harms explored here happen on private messaging. The key here is not to undermine privacy and cybersecurity protections. One way to bring private messaging into the bill and avoid undermining these protections is to impose safety obligations on the things that surround private messaging. I've mentioned many, such as complaints mechanisms, suspicious friend requests and so on.
    Thank you for your time. I welcome questions.
(1715)
     Thank you very much.
    Now I go to Carol Todd from the Amanda Todd Legacy Society for five minutes, please.
    I'd like to thank the committee for inviting me to speak. It's an honour to be able to share knowledge.
    I'm not coming as a researcher or someone who has studied this. I'm coming as a mom, and I'm coming as a parent and as an educator with lived experience, so confining my conversation to five minutes was difficult. I've written some notes that I will read until my time is up, and I do welcome questions at the end.
     I have spent the last 12 years, I guess, learning about sexual exploitation and online behaviours, and it is really hard to imagine the horrid things that are happening out there to our children. As a side note, I believe that Bill C-63 needs to be passed with some tweaks, because it is the safety net for our children and Canadians online.
    This subject holds significant importance and warrants ongoing dialogue to tackle not just the ease of access to such material but also the profound harm that can be inflicted upon those who encounter sexually explicit content every day.
    I am Carol Todd, widely known as Amanda Todd's mother. In addition, I am an educator in a British Columbia school district with my work primarily centred on digital literacy, online safety and child abuse prevention with a focus on exploitation and sextortion.
    Empowering students, teachers and families with the knowledge and skills to navigate the digital world safely is essential, important and now a passion of mine. I will continue to talk forever about how we can keep families and children safe, because this is what we needed for my daughter, and it came a bit too late.
    Amanda tragically took her life on October 10, 2012, following extensive online exploitation, tormenting harassment and cyber-abuse. Her story very much relates to what happens when there is creation, possession and distribution of sexually explicit material online and how easily others can access it as it becomes embedded online forever.
    Amanda's story garnered global attention after her tragic death. To reclaim her voice while she was alive, Amanda created a video that she shared on YouTube five weeks before her passing. It has been viewed 50 million times worldwide and is now used as a learning tool for others to start the discussion and for students to learn more about what happened to her and why it's so important that we continue to talk about online safety, exploitation and sextortion.
    As another side note, it has taken forever for us to catch up on the conversation about exploitation and sextortion. It was something that no one was able to talk about 12 years ago, in 2012. The conversation has evolved because of the increase in exploitation and sextortion online, which is happening not only to young girls, young boys and young adults but also to men and women. Nefarious offenders online have grown in number because they've gotten away with it across the many layers of the Internet these days, and they have caused much trauma and much harm, as this is a form of abuse and violence.
    Over the past decade, we've observed rapid changes in the technology landscape. Technology used to be primarily a communication tool for email, and now we have seen the evolution of applications for fun. They were described as safe, but now we know differently, because they have increased the chaos, concern and undesirable behaviours online for Canadians and for all.
    This isn't just a Canadian problem. It's a global problem, and I have watched other countries create legislation, laws and safety commissions, just as Canada, with Bill C-63, now wants an e-safety commissioner board, and I think this is a brilliant idea. For anyone here who gets to vote, I hope that it does pass.
    The prevalence of sexually explicit material has markedly increased—
(1720)
     Can you wrap up, please, Ms. Todd?
    Can I make a suggestion, Chair, that we extend the witness's testimony? Can we just give her two or three more minutes, if that's okay with everyone?
     Thank you.
     Yes, all right. I see the lights flashing, so the bells have started.
    The committee wishes to give you two more minutes, Ms. Todd.
     Thank you.
    The prevalence of sexually explicit material has increased due to the widespread use of the Internet. It manifests in various forms, including visual representations, photos, videos, films, written content, audio recordings and print material. The volume grows exponentially day by day. The protection that we have for our children and for our adults isn't there on the Internet. Big tech companies need to take responsibility. I know that throughout the world now, there are more and more lawsuits where big tech companies are being held responsible.
    When accessing sexually explicit material, some of the challenges that we are faced with include access to violent and explicit content that can impact sexual attitudes and behaviours, the harm to children through the creation, sharing and viewing of sexual abuse material, increased violence against women and girls, as well as sex trafficking. It can also influence men's views on women and relationships.
     In my notes, I comment that we often stereotype that it is men who are violating others, but the offenders can be men and they can be women. They can also be other children—peer-to-peer violence. There is no one set rule on who is creating and who is causing harm, but we know that those who become traumatized and victimized can be anyone.
    What more needs to be done? I'll just go through this quickly.
    As an educator, I feel strongly that increasing education is crucial. The awareness and education needs to go to our children and our young adults and to our families.
    We need stronger regulations and laws. Bill C-63 is one of them. I know that in the province of B.C., more legislation has already been passed.
    We need to improve our online platforms and make them accountable. We need to increase parental controls and monitoring, and we need to encourage reporting.
    We also need to promote positive online behaviours. Social emotional learning and social responsibility are part of the awareness and the education that needs to come.
    We need to be a voice. We need to stand up, and we also need to do more.
    Thank you for the time, and I encourage questions so that I can finish reading my notes.
    Thank you.
     Thank you, Ms. Todd. Thank you very much.
    I'm sorry. Can I just have a brief—
    The bells are going I think.
    Mr. Philip Lawrence: We're going until 5:30 or five minutes—
    The Chair: Excuse me, Mr. Lawrence. I just said that the bells are going, so I need to check how much time we have to go before we get to our five minutes before we vote.
    Go ahead, Mr. Lawrence.
    First of all, I just want to commend all the witnesses for being here. I'm wondering, given their courage and the unbelievable testimony that they're bringing, if there is any way we could extend to seven o'clock tonight.
     I don't think we have the resources to go beyond 6:30. I'm sorry. I was told that at the beginning of the meeting.
    Mrs. Thomas.
    Thank you.
    Could you just check with the clerk? My understanding is that we have resources that actually could take us potentially to eight o'clock.
    I'm sorry. I was told by the powers that be that we only have until 6:30 as a hard stop.
    If you could confer with the clerk, that would be great.
    Clerk, do we have 6:30 as a hard stop?
    The hard stop we were given was 6:30 p.m. However, if the committee wishes, I can request additional time. I can't guarantee it will be accepted, but we can ask the resources if they have availability.
(1725)
    Thank you.
    How many minutes do we have before the vote? Is somebody keeping track?
    The Clerk: We have 25 minutes.
    The Chair: We will go for another 20 minutes.
    I will go to Dianne Lalonde from the Centre for Research and Education on Violence Against Women and Children for five minutes, please.
     My perspective is informed by my work with survivors in the gender-based violence sector, and I will focus on the need for a gender-based analysis when we're talking about online harms and legislation.
    Specifically, I'm going to focus on two online harms—the non-consensual distribution of intimate images, which I refer to as NCIID, and then also deepfake sexual abuse—although I'm happy to speak more to further forms that haven't been necessarily brought forth as much, such as cyberflashing.
    Each of these forms of violence is increasing in the Canadian context. They target marginalized individuals, and they produce gendered and intersectional harms. When we're talking about the non-consensual distribution of intimate images, violence occurs when individuals have their private content taken, even from their own computers, and posted online....
    People do so for a variety of motivations, many of which link into other forms of violence. They do so to control, monitor and harass their current or past intimate partner. As well, we see especially young boys doing so, because of social pressures they face relating to traditional masculinity and expectations around sexual experience—that they should have this experience and that they should be promoting it.
    We have also seen NCIID used as a tactic to advertise, recruit and maintain control over individuals who experience sex trafficking. NCIID disproportionately targets women. Of the 295 Canadian cases of NCIID reported to police by adults in 2016, 92% were reported by women. Police-reported incidents involving youth aged 12 to 17, from 2015 to 2020, again found girls overrepresented as targets, at 86%, in comparison to boys at 11%.
    Unfortunately, we are lacking intersectional Canadian data, but if we look at national studies in America and Australia, we see that NCIID also disproportionately targets Black, indigenous and 2SLGBTQIA+ individuals, and people with disabilities.
    We see very much the same targeting when we're talking about deepfake sexual abuse. Many of these forms of applications and technology only work on women's and girls' bodies. A study of 95,000 deepfake videos in 2023 found that 98% were sexually explicit, and of those, 99% targeted women.
    When we're talking about the impacts, as you can imagine they are vast. They are emotional, economic, physical and social. Survivors have likened these forms of violence to additional forms of sexual violence wherein their autonomy is denied. They have also shared that one thing that's distinct about online harms is the way in which the harm becomes crowdsourced, and people are sharing this violent experience.
    Technology-facilitated violence impacts different groups in qualitatively specific and intersecting ways. For instance, sexual double standards result in women, in comparison to men, being more likely to be blamed, discredited and stigmatized due to sexual imagery online. 2SLGBTQIA+ individuals have identified that NCIID has been a tool to “out” their sexual orientation and gender identity. Finally, deepfake sexual abuse also impacts sex workers, especially women, whose likenesses are stolen and used to inflict violence, and who then face stigma and criminalization in response.
    In terms of ways to address this harm, I think much of the focus in legislation has been on the regulation and removal of content, and that is absolutely essential. We also need to recognize the people this is impacting, the survivors, and whom survivors are turning to. They are going to gender-based violence services in order to cope with and heal from these harms. An added dimension when we're talking about addressing online harms is making sure we're supporting the gender-based violence agencies that are doing the work to support survivors and that already have robust sex education programs.
    Some of this work is also outlined in the national action plan to end gender-based violence.
    As well, I want to echo Carol Todd's remarks about the importance of consent-based education, especially when we're talking about deepfake sexual abuse. Sometimes there's not an understanding of it as a form of harm, so we need to have education in schools and in society that is sex-positive and trauma-informed, to share that this is a form of violence and also to fight against victim blaming.
    Thank you.
(1730)
    Thank you.
    You can elaborate, as I said to all witnesses, during the Q and A period.
    We now go to Dr. Selby for five minutes, please.
    My submission to you comes from 43 years of clinical practice and research and from chairing Connecting to Protect's global summit in 2022, which involved 23 countries addressing harms stemming from children accessing pornography online.
    My experience links me directly to the consequences of childhood access to online pornography, which results in problematic sexual behaviour, including difficulties in conducting relationships, destruction of the family and, in more extreme cases, criminal behaviour. Access to pornography by children who are unable to process and understand the material is like a gateway drug, setting up future abuse and all the attendant consequences.
     For the last 13 years, I've treated individuals with compulsive sexual behaviour disorder and individuals who've been accessing child sexual exploitation material online. We are facing a global epidemic of online child sexual abuse and exploitation as a result of unregulated access to the Internet. We're getting it wrong and we're missing the mark in protecting children.
    My colleague and I have outlined in detail what we consider to be the proposed solution in our brief for Bill S-210. We simply advocate shifting the narrative from the focus on age verification to a broader consideration of age assurance options, in conjunction with device-level controls operating at the point of online access through Google, Apple or Microsoft. This approach is technologically possible and relatively quick to implement, with far greater reach and effectiveness. Device-level controls coupled with a multi-dimensional public health approach are needed, including the implementation of protective legislation and policy.
    Sadly, sexual exploitation is happening right now in Canada, feeding the production of illegal sexually explicit material online. Cybertip.ca receives millions of reports of child sexual exploitation material yearly, while 39% of luring attempts reported to Cybertip.ca in the last several years involved victims under 13 years of age. Globally, from 2020 to 2022, WeProtect's global threat assessment—and I hope you're sitting down for this—found a 360% increase in self-generated sexual imagery of seven- to 10-year-olds.
    How does this happen? It is wrong on so many levels. There is not a child protection expert on the planet who agrees that this is okay. It's child sexual abuse via digital images.
    The harms to children due to accessing legal and illegal sexually explicit material online include trauma, exploitation, self-produced sexual images, child-on-child abuse, objectification, violence, risky sexual behaviours, depression, difficulties in forming and maintaining close relationships, anxiety disorder, panic attacks, PTSD and complex PTSD symptoms, among others. Potential health issues and addiction carry on into adulthood, causing documented long-term mental health consequences that impact personal and family relationships and the very fabric of our society, unless there is early identification and treatment of the problem.
     You might be wondering how certain individuals are vulnerable to developing a problem like this or a compulsive sexual behaviour disorder. It almost always involves access to legal sexually explicit material online at an early age. The average age of exposure is 12 years old.
     I want to talk to you about the erototoxic implications of sexually explicit material online. We know we need to do something—
(1735)
    Dr. Selby, you only have 16 seconds left. I'm so sorry. You can elaborate when we get to questions and answers, and you can expand—
    I think we have unanimous consent—
    —on some of the things you want to say.
    —to give her another minute.
    I am sorry. We have a certain time with votes, Philip. Mrs. Thomas suggested five minutes before the vote. We have time for Ms. Selby to finish and for one more person in order to get all of our witnesses done before we vote.
    I'm sorry, Dr. Selby.
     On a point of order, Madam Chair, we actually don't. We have until 5:39, do we not? That leaves three more minutes.
    We have 13 minutes and 29 seconds before we vote. That leaves room for one five-minute presenter—that's the last witness—and then our five minutes before we vote.
    Thank you.
    Dr. Selby, I'm very sorry, but you will have a chance to elaborate during the question period.
    I will now go to Marc-Antoine Vachon—
    I have a point of order.
    The Chair: Go ahead, Ms. Thomas.
    Mrs. Rachael Thomas: I'm sorry, no; you just robbed her of her last 30 seconds. Please give that back to the witness.
    Do you mean the last 12 seconds? All right.
    No, it's 30 seconds, because you continuously interrupted her.
    I am here to keep the time, but go ahead.
    You have 30 seconds, Dr. Selby.
    When I began training in this area, common wisdom said that it would take 15 to 20 years from accessing problematic sexual content on the Internet before individuals realized they had a problem. I'm now seeing 17- to 19-year-olds in treatment.
    I'm an independent third party voice and do not receive compensation from any organization, including the adult industry, for my work in advocating for the protection of children.
    Thank you for hearing me out today.
    Thank you, Dr. Selby. I'm sorry, but I have to keep to the time we have decided on as a committee.
    We will now hear from Marc-Antoine Vachon, a lieutenant with the Sûreté du Québec.
    Mr. Vachon, you have five minutes.

[Translation]

    I would like to begin by greeting all the members of the committee and thanking them for giving me the opportunity to speak to them here today.
    My name is Marc‑Antoine Vachon. I have been in charge of the Internet child sexual exploitation investigation division at the Sûreté du Québec since 2020. I have spent most of my career working on and fighting crimes of a sexual nature. I have been mainly involved in this fight since 2006.
    Sexual violence, especially when the crime is against a minor, remains a concern for the public, police organizations and government authorities.
    Unfortunately, it is clear that the number of reports of child pornography is constantly on the rise. Since 2019, at the Sûreté du Québec, we have noted a 295% increase in the number of reports received and processed, from 1,137 to 4,493. Police organizations need to adapt to changing behaviours of child pornography users, as well as constantly evolving technologies.
    As a provincial police force, the Sûreté du Québec is responsible for coordinating investigations of sexual crimes against children committed over the Internet or using electronic equipment, including child pornography, sextortion and luring.
    The fight against sexual violence has been at the heart of our priorities for many years. In this regard, since 2012, the Sûreté du Québec has been pursuing its provincial strategy against the sexual exploitation of children on the Internet, which focuses on prevention, technology, training, coordination and repression.
    Over the past five years, we have been able to improve this structure thanks to additional funding from the federal and provincial governments. The results clearly show the usefulness of this funding.
    In concrete terms, three specialized investigative teams and a coordination team are dedicated specifically to countering this phenomenon by ensuring that reports are processed; conducting and coordinating investigations involving sexual predators and consumers of child pornography at the provincial level; partnering with various reporting agencies such as Cybertip and the National Child Exploitation Crime Centre, located here in Ottawa and managed by the Royal Canadian Mounted Police; providing operational expertise to various internal and external requesters, including municipal police forces and the Director of Criminal and Penal Prosecutions; identifying and stopping predator networks distributing child pornography; identifying, through various means of investigation, individuals who produce and make available child pornography on the territory.
    The implementation of the integrated child pornography enforcement team in October 2021 has contributed to a more effective fight against the sexual exploitation of children on the Internet. This team, made up of members from the Sûreté du Québec, as well as from municipal police forces in Montreal, Quebec City, Laval, Longueuil and Gatineau, conducts joint operations mainly aimed at producers and distributors of child pornography.
    In the fall of 2023, this team coordinated an interprovincial operation with our colleagues in Ontario and New Brunswick as part of the “Respecter” project.
    Completing the project's various investigations, mainly aimed at identifying the distributors of child pornography in these areas, required the participation of 470 police officers and 31 police organizations. The coordination done by the Sûreté du Québec now makes it possible to strengthen our capacity to take action against the sexual exploitation of children; synergize law enforcement efforts against a global problem; optimize case management when there is a potential situation of child exploitation; identify more sexual predators on the Internet; increase the number of arrests; and, of course, prevent more minors from becoming victims.
    The Sûreté du Québec is actively working to combat the access to, and the production, possession and distribution of child pornography in order to protect the most vulnerable and bring sexual predators to justice. Since 2019, efforts have made it possible to arrest more than 1,100 individuals and identify more than 230 real victims on our territory, the province of Quebec.
    I want to emphasize that the fight against the sexual exploitation of children on the Internet is possible thanks to the participation and collaboration of all actors and partners who care about the well-being of victims and who take relevant action, both police officers and citizens—
(1740)

[English]

     You have 30 seconds, Lieutenant Vachon.

[Translation]

    —who are invited to report any situation of sexual abuse to the police.
    Thank you for your attention.
    I look forward to our discussion during the question and answer period.

[English]

     Thank you very much.
    We now have seven minutes and 14 seconds before the vote. That comes as close to the five minutes as we can get, so I will suspend and we will try to get the House online for you to look at what's going on in the House.
    We are suspended.
(1740)

(1800)
     The clerk would like to report back to you on your request that she find out about resources.
    Our partners have confirmed that, should the committee wish, we do have resources available until 7 p.m.
    I shall now begin.
    Go ahead, Mr. Lawrence.
    I just have a quick comment. We'll wait for Taleeb to come back if anything goes to a vote, but I'm hoping we don't have to do that.
    Mr. Champoux from the Bloc had a brilliant suggestion that we just allow for enough time to do one full round, which would take us to between 6:30 and 7:00. We'll go ahead and then we can discuss it as we go.
     One full round is 24 minutes.
    We'll go with the six-minute round, so I'll begin with Mrs. Thomas for the Conservatives.
    I have a point of order.

[Translation]

    Madam Chair, I would like to clarify something.
    I completely agree that we should do a full round of questions and comments, but we are starting a little earlier than I had expected. If, at the end of this round—
(1805)

[English]

    The interpretation's not working.
    There's no interpretation. Can we please have interpretation?
    I'm sorry, Martin. You'll have to repeat yourself—
    I'm sorry to interrupt you. There wasn't English—
     Michael will have to stop speaking, because we're going to—
    Madam Chair, I couldn't hear what he was saying.
     Martin had the floor.
     I couldn't hear anything he was saying.
     If you were listening, you would have heard that I asked for interpretation and he can repeat himself. I'm waiting for interpretation.
    Go ahead, Martin. Begin again.

[Translation]

    Madam Chair, as my colleague Mr. Lawrence said, I completely agree with doing a full round, but we are starting with the questions a little earlier than I had expected. So I propose that we do the first round and, if there is time after that, I would be more than willing to have a fair allocation of speaking time among the parties for a second round.

[English]

     Thank you.
     Mrs. Thomas is next.
     I officially move a motion that we sit until 7 p.m.
    It's a dilatory motion, so it needs to go straight to a vote.
     We have a vote.
    Mr. Coteau, we have a motion on the floor.
     Madam Chair, we've always—
    I have a point of order.
    It's not debatable, Michael.
    I have a point of order.
    Go ahead with your point of order.
    Madam Chair, I'm sorry—
     We've always worked off of consensus on timing. We can't go past 6:30. There are folks who have commitments and it's impossible for us to do so.
    Mr. Coteau, that's debate. I'm sorry, but that wasn't a point of order.
    Let's go to the vote, please.
    The vote is on the motion from Mrs. Thomas to sit until 7 p.m.
     I'm just checking to see if Mr. Noormohamed has joined us.
     Chair, can I make a point of order?
    First of all, we just had an agreement that we wouldn't move any votes until my colleague came back. Mr. Lawrence clearly said that.
    Number two, I've always been under the impression that we had to have a consensus in order to move forward with an extension of time. That's been the rule that we've always worked from. I don't think it's fair to accept this motion because it's out of this—
    On a point of order, this is dilatory. There's no choice.
     The Conservatives were the ones who actually said to not make any votes until we can accommodate—
     I am sorry, Michael. You're debating the motion to go to until 7:00.
    Let's call the vote. It's a non-debatable motion.
    Call the vote, please, Clerk.
    Is Mr. Noormohamed here virtually now? There he is. He can vote.
     (Motion agreed to: yeas 6; nays 5)
    The Chair: The vote to sit until 7:00 has passed.
    Go ahead, Mr. Coteau.
     Now I can bring up the point I was trying to bring up earlier.
    Hasn't it always been the agreement in this committee to have unanimous consent in order to extend time? I remember that as clear as day. That's the way we've done things.
    No, that's not the way. I'm sorry, we just can't debate that, so we have to call the vote.
     Just so we're all clear on the record, from now on, extending time or changing times is based on the majority. Is that all?
    Yes.
    Mr. Michael Coteau: Okay, thank you.
    The Chair: Thank you.
    Mr. Noormohamed, do you have your hand up?
    Yes.
    I have to agree with Mr. Coteau. My recollection is that we have, on many different occasions, sought unanimous consent to sit beyond an allocated time. To do this now on a per-vote basis, I think, is certainly disappointing, given that I know there are four members who have other commitments, and important ones.
    It is not that this is not important; this is incredibly important, but to put members in a situation—
(1810)
    Mr. Noormohamed, your microphone—
    This has not been the practice, and I'm disappointed to see that we're doing this.
     Mr. Noormohamed, the interpreters cannot hear you. Can you turn your microphone on again if you're going to speak?
    I just want to point out that the more we talk, the less time we have for questions.
    Go ahead, Mr. Noormohamed. Can you turn on your microphone?
     My microphone is on. Can the interpreters not hear me?
    We're getting an okay from the interpreters.
    Go ahead, Mr. Noormohamed. You may repeat what you said.
     My point is, very simply, that this is an incredibly important conversation. We have members who want to participate in it but who also have other commitments that will begin after this meeting. It's disappointing that we are changing what has been prevailing practice to now move to a vote for something that could have been done through unanimous consent or with a different solution. It is disappointing that we've just decided to make this decision now.
     Thank you, Mr. Noormohamed. Are you challenging the chair?
    No, I'm just making a point.
     Good.
    I did make the ruling and he's challenging it, but okay.
    Mr. Michael Coteau: He never said he was challenging the chair.
    The Chair: We now have to start the questions.
    The first round begins with the Conservatives for six minutes.
    Mrs. Thomas, please go ahead.
     Thank you.
    Thank you to all of the witnesses for your patience today.
    My first question goes to Ms. Lalonde.
    In an article that you recently wrote with regard to Bill C-63, you said it “contains...glaring gaps that risk leaving women and girls in Canada unfairly exposed.”
    I'm wondering if you can expand on those gaps that you see within the legislation that would perhaps leave women and children vulnerable.
     Yes, I can, for sure.
    In my mind, it's around the support for survivors and how we build that in. I know there's talk of the safety commission and the ombudsperson. Unless they have a mandate to support survivors—which would be new, and they'd have to build relationships with that community—what we need is to have support for the gender-based violence sector to continue uplifting survivors and promoting healing opportunities.
    I think of that especially because the legislation talks so much about regulation, which would require survivors to be the ones who report the violence that they're experiencing. How do we even get to know about the violence unless we're supporting survivors to report it and to heal from it?
    Thank you. There certainly is a need to be survivor-centric.
    I guess I'm further concerned, though, with regard to this legislation. To your point, it sets up a digital safety commission and a digital safety ombudsman, which essentially allows individuals to issue or file a complaint or a concern with these folks. Then, of course, the commission or the ombudsman has the authority to issue fines and to evaluate how platforms are functioning. However, in terms of actual teeth, deepfakes are a real thing, as has been mentioned, and there is no change to the Criminal Code in terms of making it criminal to create and distribute intimate deepfakes online.
    I guess I'm curious if this is something that would be worth the government pursuing on behalf of Canadians and ensuring that vulnerable individuals are better cared for.
    We've certainly seen success in the U.K. in terms of their criminalization of distribution, so that does remain important. One of the biggest websites of deepfake sexual abuse was taken down in the U.K. after that decision was made.
    I think the creation piece is another area where we could stop the violence early on if people realized the ramifications before they started. That's another avenue that could certainly be explored.
     In your estimation, should there be an expansion of the Criminal Code to include intimate deepfakes or the abuse that transpires based on those?
     Yes, I think so. I think, more than anything, that it would signal this as a form of violence, given how much it is doubted. The study in 2023 that I mentioned assessed 95,000 different deepfake sexual abuse videos and also asked people, “Do you feel guilty watching these?”, and overwhelmingly people said no. There needs to be a signalling of this as a form of violence in recognition of that.
    Criminalization is important, but in addition to that, it's also about education, especially with youth. We're seeing young people increasingly use this technology in ways that could be harmful, and a response that goes immediately to criminalization and shame may not reach them. We need education that promotes sex positivity and sexual expression while sharing how harms can be produced.
(1815)
    You have also stated, with regard to this bill, that “Technology-facilitated violence that Canadians have been increasingly experiencing does not impact everyone in the same way and at the same rates”. You're tapping into this idea that there is technology-facilitated violence. How could that be better addressed through legislation?
     Even when we're talking about a digital safety commission, we're talking about raising digital awareness. It's also getting to the root causes, so yes, we need to talk about digital literacy, but we also need to talk about misogyny. We need to talk about gender equity. These are all very interconnected issues, especially when we're talking about these forms of violence that so disproportionately target women and girls.
     Okay. Thank you. I take your point, for sure.
    My next question is for Dr. Selby.
    Can you talk a little bit about platforms—you touched on this a little bit in your opening remarks—in terms of strategies around how platforms function and the vulnerabilities that are created for children online? What could be done from a legislative standpoint?
    I understand that there's a lot of education needed, for sure, but in this room we're legislators. What could be done on that level in order to protect children on those platforms?
    There are 30 seconds left.
    Children are finding sexually explicit content, illegal content, on many platforms like Wikipedia and TikTok—you name it. It's just fascinating where they're finding this material.
    A device-level control with age-assurance technology is the only answer, because you're not just finding this material on porn sites; it's everywhere. The places you can find it will blow your mind.
     What mechanisms could be put in place?
    Thank you, Ms. Thomas. Thank you very much.
    I just want to inform the committee that Mr. Krishnamurthy is back, if you wish to question him. He had left, but he's come back.
    Next is Patricia Lattanzio for six minutes, please.
    Thank you, Madam Chair. I'd like to thank everybody for being here today.
    I'm glad that Mr. Krishnamurthy is back, because I'd like to address the following question to him.
    There is no question that everyone present here, and everyone engaged in this important discussion in other forums both nationally and abroad, wants one outcome above all others, and that is to ensure that children are interacting online free from threat or violation. Also, I think that we all agree on the importance of outlining and establishing safety standards to see that this is achieved. The issue, however, seems to lie not in the “what” but in the “how”.
    Mandatory age verification technology, as we know, is still in the early stages, not just in its refinement but also in its conception. It's an important element of the conversation, but it has been tried and it has failed in multiple U.S. states, such as Louisiana, Utah and Texas. As you likely know, VPNs, or virtual private networks, exist as a means of circumventing both the age restriction controls for children and the means of tracking offenders. This is an important nuance to consider. These tools can significantly hamper the utility of existing age verification technology as they relate to offenders and victims.
    Here is the question: How does age-appropriate design and language seek to address these issues and improve the safety standards for children?
(1820)
     Thank you very much for that question, which is a difficult one to answer, but let me try my best to speak to some of the points you raise.
     I'm intimately familiar with the age verification laws that have been enacted in the United States and some other jurisdictions. My view is that these laws are ineffective not just because of the technological points you raised in the question, but also because of a fundamental problem: this technology is not mature, and it poses many risks to the privacy and cybersecurity of individuals who are using the Internet. This is a solution that may become better as the technology develops. This is why I believe that the Senate bill is ill-considered at this time.
    As to age-appropriate design as a different concept, the idea of age-appropriate design is that websites or other online services that cater to children, or that are likely to have young people as users, must incorporate, by design, various kinds of protections to ensure that they are used in an appropriate manner or that they are protected from misuse.
     I think this is a very important set of interventions that get to the previous discussion that was had with Ms. Thomas, which was regarding the prevalence of this kind of harmful content in many parts of the Internet. The idea here is, again, to reduce the prevalence of that content by making sure that sites that appeal to children or that are likely to be used by children have measures in place to keep them safe.
    I think the larger point that I'd like to make here is that this is a very complex set of problems that have no single legislative or technological solution. We're going to need different points of intervention that regulate different kinds of players in the technology ecosystem, as well as people who use technology, if we're going to effectively deal with the problem.
     I also think that it's important to understand that we're never going to achieve a 100% solution to any of these problems. The problem of sexual exploitation of children or of the circulation of unlawful intimate imagery certainly predates the Internet. It will probably postdate the Internet in terms of what technology comes next, so what we should be looking for are solutions that are significantly effective in reducing the prevalence of this content and the harm it causes.
     My next question is addressed to Ms. Laidlaw.
     Bill C-63 was developed to ensure compliance with all existing privacy laws and global best practices. Do you have any concerns related to the privacy implications of Bill S-210? Also, how do we ensure privacy is upheld in the development of online safety regulations?
    Would you like to redirect that, Patricia? Ms. Laidlaw is no longer here.
     Sure.
     Can I address that same question to Mr. Krishnamurthy?
    Yes, I would be happy to answer. I would not speak for Ms. Laidlaw; however, we are friends and colleagues.
    Specifically on age verification technology as it is currently designed, there are a couple of problems. The first is that almost every age verification method that is currently used requires you to divulge personal details, which means that your Internet activities are being tracked by somebody in some way. There are some computational efforts to reduce that, but it is a first-order problem with the technology.
    Another set of approaches uses biometrics—for example, the shape of your face or certain characteristics—to try to determine what your age is, and those approaches suffer from significant inaccuracies. Also, they collect a very sensitive form of information—or at least they process it—which is biometric data.
     There is research under way that seeks to implement age verification in a way that causes fewer privacy harms, but as far as I know, we're not there yet on developing that technology.
    Again, this is an area where there's a lot of innovation spurred by legislation, but I would caution against this technology.
    Thank you.
     The time is up, Ms. Lattanzio. Thank you very much.
     I now go to Mr. Champoux.
     You have six minutes, please, Martin.

[Translation]

    Thank you, Madam Chair.
    I, too, would like to sincerely thank the witnesses for their patience. Sometimes, the vagaries of Parliament mean that we have short waiting periods, such as the one we experienced today because of the votes.
    I would like to speak to Lieutenant Vachon.
    In your opening remarks, you talked about a 295% increase in reports since 2019. That's a dizzying number.
    Please reassure me. Is there a part of this increase that, without being called positive, could be due to the awareness that has been raised and that has led people to decide to report someone or take legal action? I'd like for some of that dramatic increase to be a result of the good work you've done.
(1825)
    You're right to raise that. We've heard it a number of times today. Technologies are constantly improving, including in their ability to search for this type of image on servers.
    We're seeing U.S. electronic service providers, big players like Facebook and Google, reporting more images. We believe that has to do with their detection methods being better than they used to be. The media are also constantly insisting that complaints be filed. People are being encouraged to go to police stations and to report through Cybertip. That kind of increase is also reflected among those businesses, which are our partners.
    Greater ease of access to the Internet will definitely lead to an increase in the number of reports. There are more and more Internet users around the world, including in Quebec. I don't know of a single home where people don't have access to the Internet. This is certainly reflected in the number of users who consume this type of material, but also in the tools that make it possible to detect it and send these reports to police stations.
    So there is a lot of awareness raising and education that we need to do as parents, but also as a society and as legislators.
    Since we're talking about legislation, I can't help but mention Bill C-63, which was recently introduced and which I hope will be considered as quickly as possible.
    Have you had a chance to look at it? If so, what are your impressions of this bill, which may be intended to help you do your job?
    I think it's a very good start. The solutions will never be perfect, as the technology is advancing so quickly, but I think it's a good idea. That bill proposes a good solution. I believe that people who host computer data should be required to know what they are hosting.
    We can sort of see the principle. To draw an analogy, I'll take the example of a convenience store. A convenience store is not allowed to sell firearms. Why would a web host be allowed to host illegal data? There are bots and software that can make it easier for them to detect this type of material.
    Although I haven't read it in its entirety, I believe the bill seeks to require those hosting content to be aware of what they host and to report it to the authorities. That's necessary.
    That said, the Canadian Centre for Child Protection also provides free tools that enable these companies to use the centre's bots, along with files categorized by police officers, to prevent someone from putting files already known to be child pornography on a data server.
    In an interview, you yourself said that consumption implies production.
    Of course, there must be more oversight of those who host this content. They need to be accountable. We agree on that. The legislation must be tough enough and serious enough to discourage them from continuing. Without that fear, it's like giving them a slap on the wrist. They get through it, and then they carry on.
    I would also say that it's a question of supply and demand. There is a demand. There is a clientele that is looking for this content.
    Are we strict enough and effective enough when we catch predators or consumers of this type of pornography, especially child pornography? Are we harsh enough to deter these people?
    What can we do to take away their desire to search for this type of content, regardless of where it is?
    You're right to mention that. It's the principle of consumption. It's the law of supply and demand. If there is consumption, there is definitely production. We see it in all walks of life, both in criminal circles and in legal businesses.
    There are minimum sentences for child pornography. I think that judges also have a lot of leeway in these situations. We see sentences being imposed based on the crime committed and the evidence from the police investigation.
    What we often see in our searches is a trivialization of these images by both the families and the suspects, in the sense that they will claim that they did not touch a child. Why are they being arrested when, in their opinion, it is just an image? Why would they go to jail, why would they go before a judge, and why would they have a criminal record?
    I think that's what we really need to work on. We have to work to change the mentality of the accused and the families. We often see families protecting the arrested person by claiming that they have not abused anyone. Consuming that image, however, is feeding the person who produces it. For there to be a consumer, there has to be a producer.
(1830)
    Thank you very much, Lieutenant Vachon.

[English]

     Thank you very much.
    I now go to Ms. Ashton. Niki, you have six minutes.
     Thank you, Madam Chair.
    We know that kids in this country are struggling, and we need to do more. Families are crying out for solutions to keep their kids safe. As legislators, we need to be solution-focused to give kids, parents and educators the tools they need to combat online harms. We owe it to the young people of this country to do everything we can to keep them safe. We need to keep kids safe, and we need to get it right when it comes to privacy rights. We understand the importance of safeguarding individuals' sensitive information, especially in the digital age, when privacy breaches can have severe and—as we've heard from some of our witnesses—sometimes fatal consequences.
    I'd like to direct my questions to Ms. Todd.
    I want to thank you very much for appearing before this committee. You are a leading voice in this country. I want to thank you for sharing what is obviously a very heartbreaking experience—what your daughter went through and what your family has gone through—and for the way you've been able to move that into advocacy and social change.
    I also remember that a number of years ago, a former colleague, MP Dany Morin, was working to develop a national bullying prevention strategy. I remember that he consulted with you on it. I recognize this and I want to acknowledge that your leadership work has been ongoing for so many years. That was important work. The work you continue to do is important. I am disappointed that the national bullying prevention strategy was voted down by the Conservative government in power at the time. It's clear we need to do better.
    I also want to acknowledge that your daughter Amanda's story affected Canadians from coast to coast to coast. It's tragic, and it highlights the extent to which we as legislators have failed to keep young Canadians safe. We have to do much more in that regard. Amanda's story, which you bravely retold so that no other kid or parent has to go through what your family went through, makes clear how much more work needs to be done. It shouldn't take 12 years after Amanda's bullycide for an extradition and a court case. Nor should the RCMP have ignored a credible tip a year before her death, saying there was nothing that could be done.
    In your mind, looking ahead, what should be done so that cases like what your daughter and others went through would be taken seriously going forward?
     Thank you for your kind words.
    I'm going to be frank. Amanda died in 2012. We are now in 2024. We're almost at 12 years. I've stood up, I've used my voice and I've been an advocate. I've watched what happened in her life and I've talked to many people and organizations around the world. What you do as politicians and legislators is wonderful, but you put up so many roadblocks.
    I'm going to be frank, and I'm not saying this to anyone specifically; I'm saying this generally.
    So many roadblocks get put up by one political party versus another. I have appeared before six standing committees since 2012: on technology-facilitated violence, on gender-based violence, on the exploitation of children and young people, on intimate images, and now this one.
    I could copy and paste facts that I talk about: more funding, more legislation, more education, more awareness. Standing committees then come out with a report. We see those reports, but we never know what happens at the end: Do these things really happen? Is there more funding in law enforcement for training officers and for their knowledge? Are there changes in legislation?
    Right now we are looking at Bill C-63. I read the news and I look at the points of view. I have someone from the justice minister's office contacting me regularly, because I understand that second reading came up on Bill C-63 last Friday.
    Then you go back to the comments, and all it amounts to is infighting and arguing. Will this bill be passed? Other parties say no, it shouldn't be passed.
    We are harming Canadians, our children and our citizens when things don't get passed. If you do your research, you see that other countries have passed legislation similar to Bill C-63. Australia is already in its third or fourth revision of what it passed years ago. I was in Australia last year, and I met the eSafety Commissioner. I met law enforcement. I was a keynote speaker at one of their major exploitation conferences. I felt sad, because Canada was represented by only two officers from Ontario. Canada was so far behind.
    We are a first world country, and Canadians deserve to be protected. We need to make sure that everyone works on the legislation and on the details. It's not just about passing laws: There are different silos. There's education. There are the kids. There's the community. We all need to get involved. It's not about putting someone in jail because of.... It's about finding solutions that work. As a country, we are not finding those solutions right now. We aren't going to find every predator in the world. Globally today, 750,000 predators are online looking for our children.
    In my case, Amanda's predator came from the Netherlands. It's not just about one country, because the Internet is invisible fibres. We know that Nigeria has exploitation—
(1835)
     Thank you, Niki. Your time is now over.
    Now I'll go to a second round. This round is going to take a total of 25 minutes. If we are really rigid about timing, we could do five minutes, five minutes, two and a half minutes, two and a half minutes, five minutes, and five minutes, or we could shorten that if you like.
    If we're going to go with the second round, Mrs. Thomas, you have five minutes.
     Ms. Selby, I'm going to come back to you for a moment.
    With regard to where we left off in terms of platforms, are there legislative changes that could be made in order to better protect children on platforms?
     We need to get back to a type of protection that is fit for purpose for the majority of individuals online, including children. There's always a perfect storm of events that leaves an individual vulnerable to whatever they're looking at.
    In terms of the tools you pick, if you use a device-level control with age assurance technology, then you cover the majority of platforms. If you just cherry-pick who you're going to go after, such as a porn site or someone like TikTok, what are you doing? With a device-level control, any device sold in Canada has to implement an age assurance technology that you, as a government, approve. All these tools get vetted. Then you pick the one that you think, at that moment in time, should be used.
    You can do that, but you need to have a tool at the device level that hits all of these sites. There are so many different ones. People are finding explicit sexual content all over the place on the regular Internet, not on the dark web. They're finding it on the regular Internet.
    I hope that answer helps.
(1840)
     I think it does, to some extent. I'm just curious, though; it feels as if your answer somewhat dismisses the responsibility of platforms. It's not that you're intending to do that, but I want to dig into this a little further.
    Should platforms not have a duty of care when it comes to—
     How do you regulate platforms? Are you going to pick one and not another?
    Why would you have to pick one and not the other?
    The problem is that there are so many platforms that offer material that is problematic. How do you decide who you're going to implement the technology with?
     I think that's up to legislators to determine.
    I think that is a huge challenge for you. That is a huge challenge.
    Look at what's going on in Texas and all these other states. They've cherry-picked who they're implementing an age verification technology with and they've also cherry-picked which technology they're using. As we've heard today, many of them have not been proven effective or do not actually protect what they say they're going to protect. We have some issues here. That's why we go to the device level, as it just gets everything. You can't turn on your phone without—
    I'm sorry. I have to make sure that my colleague gets some time here.
    I'll switch it over to Kevin.
     Thank you, Ms. Thomas.
     I sat and listened to the debate on Bill C-63 on Friday. There was, I think, a high school class watching from the gallery, and I give the teacher a lot of credit. It was interesting, because as Bill C-63 was debated, the government had its statement and the opposition had theirs, and at issue was a trade-off between a guarantee of security and the Charter of Rights. We have seen that trade-off in many of these bills.
     Ms. Selby, what would your recommendation be to those high school students? Many of them are just coming into the adult world. What would your recommendation be on the Charter of Rights and their security around sexual exploitation?
    Well, that is an interesting question. I don't know if many of you are aware of this, but the United Nations Convention on the Rights of the Child has an Optional Protocol on a communications procedure, which was ratified by 41 nations. That protocol gives individuals the right to file a complaint with the committee alleging a violation of their rights under the convention. That includes, under numerous provisions, a failure to protect the child from harmful exposure to pornography.
    Therefore, kids can actually file a complaint with the United Nations, if they decide kids are children. If they decide—
    Thank you, Dr. Selby. Your time is up.
    I'll move to Mr. Coteau for five minutes.
     We're going to split our time. I'll take two and a half minutes and then pass it over.
     I want to thank all the witnesses for being here today. This is, of course, a very important issue to all of us in the room here. I want to say thank you especially to Carol Todd. Thank you for sharing your story.
     I want to ask a question of Mr. Vachon.
    Sir, you said there was a large increase from 2019. I think you said it was about 300%. You went from 1,000-plus cases to 4,000 cases.
     Do you have an understanding of why that increase has happened so drastically over the last few years?

[Translation]

    I think the detection methods that digital service providers use are improving. Artificial intelligence is used not only for nefarious purposes but also for good ones.
    Moreover, prevention efforts in schools also encourage children, teens and adults to report that type of crime. They can go to the police or do so anonymously on special lines such as through the Sûreté du Québec's criminal information centre. So people are being encouraged to report more of those crimes. I also think that, over time, the police are improving the support they give victims and their way of dealing with them.
    These prevention efforts and improved techniques are increasing the number of reports made and processed by police officers.
(1845)

[English]

     I'd like a really short answer to this question.
    Because you're on the front line, is there a way to take the knowledge you have and transfer it to those who work with young people in order to put preventive measures in place?
     I have only about 30 seconds left, and then I have to pass the floor to my colleague.
     How do we capture that information and transfer it?

[Translation]

    Police organizations have prevention teams that we work with often to create guides. In Quebec, a continuum of prevention efforts begins in grade one to help children learn right from wrong, what can be done and what must not be done, which leads to incidents being reported. The issue of consent and so forth is also discussed. The prevention team is not part of my division, but we work hand in hand with them and help them strengthen their tools.

[English]

    Thank you so much for your work.

    Thank you, as well, to all the witnesses for being here.

[Translation]

    Mr. Vachon, I also have a question for you.
    Ideally, images of the sexual abuse of children would never be published online, of course. Bill C‑63 includes provisions requiring removal of material within 24 hours.
    I'd like to know what you think of that tool proposed in the act. Further, are there other tools that could improve this bill or that we should consider?
    Yes, that has to be done. Setting a time limit for mandatory removal is a very good idea. In many cases, companies do not respond, especially if they aren't Canadian. We have seen, with removal requests relating to complaints made through the Canadian Centre for Child Protection website, that replies are not always received. Setting a time limit is important, but I think it is also important to impose fines on service providers that do not comply with it.
    As to the removal of online images, the international policing community feeds information into a database. Each file has a digital footprint, much like a fingerprint. That database is made available to the National Centre Against Child Exploitation in Ottawa, as well as to the Canadian Centre for Child Protection. They use it, along with other technologies and bots, to search websites and identify images of child pornography.
    Those organizations offer free tools to companies. Requiring Canadian companies to use the tools of the Canadian Centre for Child Protection would therefore be a very good idea, in my humble opinion, because it would help remove content already stored on servers and prevent known files from being added to them. That would block the material at the source. It would not be perfect, because there are always images circulating and there will always be producers, but at least it would remove a good chunk of the material.
    There are a number of other sites available to victims to have images removed, but unfortunately there is no monitoring of the effectiveness of those sites.
    That's a good suggestion.

[English]

    You have 18 seconds.

[Translation]

    Did you say it's an international database?
    Yes, it's international. It includes all policing communities.

[English]

     Thank you.
     I now go to Monsieur Champoux for two and a half minutes, please.

[Translation]

    Thank you. Two and a half minutes, that's not much time.
    I'm glad you already answered the questions I would have asked in my short turn.
    I have a 13-year-old daughter and I sense that the message about being careful does not always reach young people that clearly.
    I suppose it is primarily consumers that you arrest. Do you think the message is getting out more clearly? Are there things we could improve, not only as parents but as a society? I'm thinking of how to make girls who are 13, 14 or 15 years old realize that it isn't cool to be photographed or filmed by friends who then do whatever they want with the images.
    Have we got the right approach now? What could we do better?
    Honestly, yes, I think we have the right approach.
    Let me tell you about Quebec. Our prevention work is effective, as I said earlier. The policing community is changing more and more. When we were young in Quebec, William Bumbray would show up with his big moustache. Now I see policing communities adjusting to their clientele. The Sûreté du Québec makes videos that are posted online, on YouTube. We are on social networks.
    That is how we reach teens today. We maintain our presence on social media, adapting our prevention message according to age and making it funny and a bit lighter rather than simply saying not to do this or that. That is where the Sûreté du Québec is at with the videos we publish.
    So prevention is important. As I said earlier, the prevention continuum starts in grade one. We have to hammer home the prevention message, along with the potential lifelong consequences, because there are lifelong consequences. I think that is how we can reach young people today. There will of course always be people who downplay things.
(1850)
    It's a lucrative industry with a lot of customers. Demand is high. I think you have answered all our questions about deterrence.
    Who produces the content though? Is it street gangs? Or is it individuals? Is it organized crime? Who produces those videos and that content?
    There are obviously a lot of opportunities to produce this content all over the world, given the ready access to those images. They are online, so it is a global problem. In Quebec, we often see people who are very close to the victim, such as a father, an uncle or a child-care provider, doing that kind of thing. So these are crimes of opportunity: People have access to the victim and obviously have an interest in that kind of sexuality.
    I have not seen street gangs involved in those crimes because they are more interested in procuring and that kind of thing.
    Since production of the content is not specifically organized, that must make your work harder, because those individuals are, in a way, operating on their own.
    Exactly. There is no pre-established network. We have not seen any. They are really crimes of opportunity—

[English]

     The time is up, Martin.
    Just finish your answer, Lieutenant Vachon.

[Translation]

    Crimes of opportunity that are committed by parents, uncles, aunts and so on, people who are very close to the children and cause irreparable harm.

[English]

     Thank you very much.
    Niki, you have two and a half minutes, please.
    Thank you. I have just a quick question.
    Is Dr. Laidlaw still online?
    No, Dr. Laidlaw left at about six o'clock.
     Thank you for that.
    My question is to you, Ms. Todd.
    You spoke powerfully about the role of education, and I was hoping you could talk a bit more about the role that education can play in keeping children safe.
    I know that in Finland, for example, through its KiVa program, they've done a lot of good work to reduce the social rewards that bullies receive. The program empowers youth to withhold those rewards, trusting kids to make the right decisions once they're given the right information, and it teaches them about the harmful effects of cyberbullying and about consent.
    What role do you think empowering youth through education should play?
     Thanks, Niki.
     I have to apologize for my passion. My passion comes out as a parent and also as an educator.
     I work in the third-largest school district in British Columbia, and my role as a digital literacy and online safety educator comes into play. When going out to schools or talking to parents.... If you think about it, who has the ears of children every single day? It's educators in the school system.
     We work both on prevention and on what follows once a crime has been committed. We talk to kids as young as four or five years old, in preschool, pre-kindergarten or kindergarten, and there's a time to talk about online safety, social and emotional learning, respect and how to interact with others. That is the core, and you build upon it year after year.
     Working with a variety of teachers all over the place, we find that some teachers aren't getting it. They aren't seeing the importance. They feel that reading, writing, social studies and science are more important than teaching about digital literacy, online safety and citizenship.
     We need to ensure that this is done in the schools and that we are talking to our kids and, more importantly, to parents. Parents out there are unaware. They're handing devices to their kids as early as seven and eight years old, and then they're complaining when this or that happens. We need to educate, and we need to bring more awareness. The question is this: How are we going to do that?
    My role as an educator and as a parent is to get that information out to those who need it. Yes, the tech industry and governments all need to be part of that, but this is multi-level. If you want to look at bubbles all around, there are multiple things that need to be done in this area.
(1855)
     Madam Chair, how long do I have left?
     You have 24 seconds.
     Okay.
     We know some important work has happened in B.C. on this front. There's the B.C. NDP government's work around the Intimate Images Protection Act.
    Ms. Todd, what lessons do you think could be learned by the federal government when it comes to B.C.?
    The B.C. government has worked really hard on this, on ensuring that there is a reporting tool in place so that if a matter isn't criminal, there is a civil avenue. If an image has been shared in public, there are things a targeted victim can do.... There is the tribunal, where counselling is provided. There are mentors to help the person figure out what to do next and the steps for civil restitution. Then there are ways to get the image taken down. That is the next step in the process.
    Thank you very much. I'm sorry, Ms. Todd.
    Go ahead, Mr. Coteau.
     Considering that we have three and a half minutes left, I'd like to move to adjourn.
     Thank you.
     There's a motion to adjourn. There is no debate.
     Is there anyone who does not agree?
    Some hon. members: Agreed.
    The Chair: Good. Thank you.
     The meeting is adjourned.