Skip to main content

JUST Committee Meeting

Notices of Meeting include information about the subject matter to be examined by the committee and date, time and place of the meeting, as well as a list of any witnesses scheduled to appear. The Evidence is the edited and revised transcript of what is said before a committee. The Minutes of Proceedings are the official record of the business conducted by the committee at a sitting.

For an advanced search, use Publication Search tool.

If you have any questions or comments regarding the accessibility of this publication, please contact us at accessible@parl.gc.ca.

Previous day publication Next day publication
Skip to Document Navigation Skip to Document Content






House of Commons Emblem

Standing Committee on Justice and Human Rights


NUMBER 125 
l
1st SESSION 
l
44th PARLIAMENT 

EVIDENCE

Thursday, December 5, 2024

[Recorded by Electronic Apparatus]

(1100)

[English]

     I call the meeting to order.
     Welcome to meeting number 125 of the House of Commons Standing Committee on Justice and Human Rights.
    Pursuant to Standing Order 108(2) and the motion adopted on December 2, 2024, the committee is meeting in public to begin its study of the subject matter of Bill C-63, an act to enact the online harms act, to amend the Criminal Code, the Canadian Human Rights Act and an act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service and to make consequential and related amendments to other acts.

[Translation]

    Before welcoming our witnesses this morning, I wish to call your attention to the presence in the room of Ms. Sokmony Kong, Secretary of the Cambodian division of the Assemblée parlementaire de la Francophonie. This parliamentary official was chosen by the Association des secrétaires généraux des parlements francophones, or ASGPF, in recognition of her very highly esteemed work within her organization. Ms. Kong chose the Parliament of Canada for her two-week professional development placement.
    We wish you an excellent stay with us, Ms. Kong. As a former member-at-large representing America for the APF, I’m very pleased you chose Canada. I therefore wish you a good stay with us.
(1105)

[English]

    I would like to welcome our witnesses for the first hour. They are all appearing by video conference.
    Before I say their names, I have a few reminders.
    I'm going to ask colleagues in the room or by video conference to please wait until I recognize you by name before speaking, and to ensure you address your questions through the chair. Please do not take the floor until after you are recognized.
    For witnesses participating by video conference, please ensure you have selected, on the bottom of your screen, the language of your choice.

[Translation]

    I also want to say that all of the equipment belonging to the witnesses here with us this morning was tested and everything is working well.

[English]

    As the chair, I want to make note of the fact that it is my responsibility, with the help of the clerk, to keep time as best we can in order to allow fairness for the witnesses, and for the members in the room asking questions, and also to suspend for a minute to allow one hour for the second group of panellists to be brought in.
    I will now introduce them to you and ask each of them to give their opening remarks for up to five minutes.
    With us this morning, from the Amanda Todd Legacy Society, is Madam Carol Todd, founder and mother.

[Translation]

    We also welcome Ms. Lianna McDonald, executive director of the Canadian Centre for Child Protection.

[English]

    We also have Carl Burke and Madam Barbie Lavers, who are participating together as individuals.
    Now I will ask Madam Todd to please begin with her opening comments.
     I'm speaking to you from Vancouver, British Columbia. I thank you for this invitation to participate in this prestudy session on Bill C-63.
    To start, the majority of what I'm going to say in the next five minutes and in answer to the questions are my thoughts and my thoughts only.
    Today I must stress the importance of Bill C-63, the online harms act. This bill is a comprehensive approach to addressing the growing concerns of harmful content on the Internet. Online safety, I feel, is a shared responsibility, and everyone—users, parents, educators and platforms—plays a role in creating a safer online world by ensuring protection, accountability and support.
    My name is Carol Todd. I'm widely known as the mother of Amanda Todd. I am a teacher-educator in British Columbia with my work primarily centred on education on digital literacy, online safety and child abuse prevention, namely exploitation and sextortion. Providing children, teachers and families with the knowledge and skills to navigate the digital world is essential and is one of the reasons I created a legacy, a non-profit, in Amanda's memory.
    My daughter, Amanda Todd, was a Canadian teenager whose tragic story brought international attention to the severe impacts of cyberbullying, online harassment and exploitation. She was born in November 1996 and faced relentless harassment both online and off-line as a young teenager. She ultimately took her life in October 2012. Knowingly, parents shouldn't outlive their children in preventable situations.
    Amanda's ordeal began when she was 12 years old. She was persuaded by an online stranger to expose her breasts on a webcam. This individual saved the image and later used it to blackmail her, threatening to share the photos with her friends and family if she didn't perform more explicit acts. Despite changing schools multiple times, Amanda couldn't escape the harassment, and the blackmailer continued to follow her for two and a half years, creating fake profiles to spread the image and further humiliate her.
    In September 2012, five weeks before Amanda took her own life, Amanda posted a YouTube video entitled “My story: Struggling, bullying, suicide, self-harm”, in which she showed flash cards to share her painful experiences. She detailed the bullying, physical assaults and severe emotional distress that she endured both online and off-line. The video went viral after her death, and currently it's been viewed about 50 million times across the world.
    Amanda's death prompted significant public and governmental responses. In 2022, Aydin Coban, a Dutch man, was convicted of harassing and extorting Amanda in a Canadian court and sentenced to 13 years in prison. He is currently serving his Canadian time in the Netherlands.
    Amanda's story continues to resonate, highlighting the urgent need for stronger protections against online harassment and better supports for victims of bullying, cyber-bullying and exploitation.
    There are so many voices that remain unheard due to fear, judgment or shame, or because they can no longer speak. It is vital to let these silent voices be heard and to create a more compassionate and understanding world, where we help and not hurt.
    Over the past decade, we have observed rapid changes in technology. We have watched devices that were a useful tool for communication turn into fun devices that can exploit and hurt others. Since its inception, the Internet has taken on darker tones. The word “algorithms” is now in our vocabulary, where it once never was.
    Research has highlighted some of the harmful effects related to screen time. These effects include reduced well-being, mood disorders, depression and anxiety. These effects impact children and adults alike in a world filled with online media.
     With increased access to the Internet comes easier access to violent and explicit online content that can impact sexual attitudes and behaviours, harm to children through the creation, sharing and viewing of sexual abuse material, and increased violence against women and girls, as well as sex trafficking.
    Governments must take action to enact new laws and modify existing ones.
(1110)
     To make the online world safer, we must increase education and awareness. We must have stronger regulations and laws, like Bill C-63. We have to improve the behaviours of the online platforms. We need parental controls and monitoring, and we need to encourage reporting like Cybertip.ca.
    Bill C-63
    Ms. Todd, we will come back to you with questions, because I think our members will probably want to flesh those out as well.
    Okay.
     I neglected to ask.... I know that when people are on the screen they're not paying attention, but I'll do my best to show that there are 30 seconds before time is up, so as not to interrupt.
     I will now move on to Madam McDonald, please, for up to five minutes.
    Thank you very much to the committee for this opportunity.
    My name is Lianna McDonald, and I am the executive director of the Canadian Centre for Child Protection, a registered charity that has been operating for nearly 40 years to protect Canadian children.
    For the past 22 years, we have been operating Cybertip.ca, Canada's national tip line to report online crimes against children. In 2017, we launched Project Arachnid, an innovative online platform that targets the removal of child sexual abuse material at scale. It is through this critical work that we have witnessed first-hand and all too often the colossal injury and harm that happen every single day to children online. The unregulated Internet has basically destroyed childhood as we have historically known it, while children and families are paying a devastating price for the ongoing failure of government to regulate online spaces.
    There has been a steady increase in the number and seriousness of online crimes against children since the rise of social media, which created a perfect storm of injury and harm to children. We saw another huge and significant jump after 2020 with COVID. These are key events that have exacerbated and intensified harm to our children.
    We've handed children technology that has been weaponized against them by predators and technology services, and I'll underscore what I'm talking about. Every month, through Cybertip.ca, the tip line, we process over 2,500 reports, and these are reports by Canadians who know to come in to us. We've seen a 760% increase in luring reports since the start of COVID in 2020. We've managed more than 4,300 requests from youth and their caregivers in the last year alone. We receive approximately seven sextortion reports every single day at the tip line, and we've processed close to 4,000 sexually explicit deepfake images and videos of children. Finally, since 2017, we've issued over 40 million takedown notices to companies, to get them to take down child sexual abuse material.
    There is no other entity in Canada that is doing the work we are doing. Regrettably and in a very difficult way, we are witnessing first-hand the scale of harm that is happening to our children and how it has evolved over the years. We are dealing with young people who are terrified about what an offender will force them to do next, youth who are frantically trying to get their child sexual material down, families who are dealing with situations that have escalated well beyond anything—anything—that they ever could have imagined.
    We are supporting survivors of child sexual abuse material from all over the world. Abusive imagery of them is endlessly uploaded and re-uploaded on platforms available to anyone with an Internet connection. These victims and children have been stripped of their privacy and their dignity, and, in fact, they have no recourse. Their rights are repeatedly violated while the predators who obsess over them, the ones who stalk, harass and target them, are shielded by the cloak of anonymity that technology affords them.
    To try to deal with this mess through Project Arachnid, we are issuing between 10,000 and 20,000 notices to companies every single day. These notices are overwhelmingly for known child sexual abuse material. By that, I mean imagery that has been circulating for years, tormenting survivors, yet still these platforms get to choose whether or not they take it down. They get to regulate themselves. They get to decide all on their own what is okay and what is not okay. It's outrageous, and it must change.
    To put this all into perspective for you quickly as I close, I'll give you a sampling of the actual interventions that our organization deals with every day. Imagine—and this is happening—a young girl between the ages of 11 and 12 who is being tortured daily by a group of anonymous men. Every day, she is ordered to go into the school bathroom and is instructed to self-abuse and harm while recording the material. She is paralyzed by fear. She does as she's told. The requests get worse, more degrading, more harmful. Eventually, she reaches out for help.
    Imagine that a teenage boy is tricked into sending a sexual image to a person he thought was a peer, but that person turns on him and threatens to send the image to all his friends and family. He is shocked. He is terrified. He believes with every bone in his body that they will do what they've said. He pays them. It's not enough. The threats keep coming. He is desperate to make it stop.
(1115)
     These are just a few of the examples that we hear. They are not hypotheticals—
    Thank you, Ms. McDonald.
    We'll get back to you with questions. Thank you.
    Thank you.
    I'm now going to ask you, Madam Lavers, to please commence your five minutes.
    Thank you.
     Good morning. Thank you for inviting my husband and me to speak today.
    We want to introduce our son to you today. Harry was a very outgoing and inclusive young man. He was intelligent and handsome. He was an athlete and a brother, and he was loved by his friends and his community.
    Harry was a patriot. He loved his country. He joined the cadets at age 14. Then in grade 11, in fall 2022, Harry joined the Prince Edward Island Regiment. He was 16. He was doing his basic training in Summerside, Prince Edward Island, on the weekends, while going to Souris Regional School full time. He only had one weekend left to complete his basic training for the RCAC. He was so proud of Canada, and he planned to dedicate his life to serving his country.
    I'm Barbie Lavers. My husband is Carl Burke. We are Harry's parents. Harry was 17 years old when we lost him to sextortion. As a family, we had many conversations with Harry and his sister Ella about safe online use and about the dangers of sharing images online. Unfortunately, our family was not aware of the word “sextortion”. We had never heard of it.
    On April 24, Harry came to his dad and told him that he had screwed up. He had shared intimate pictures with a girl, supposedly his own age, from Nova Scotia. This individual was now demanding money, or they would share Harry's images with all of his contacts, and in particular with his commanding officer in the RCAC. Sadly, this individual did share some of the images with his friends in cadets, and Harry knew this. I was also contacted on Instagram by apparently the same individual, who told me they would ruin his life.
    When Harry came to us that evening and told us what had happened, all four of us sat at the table, talked about it and made a plan to contact the local RCMP in the morning. We thought Harry was comfortable with this plan, but sadly, he wasn't.
    On the morning of April 25, we were getting ready for our day. My husband went down to check on Harry. The sheets in his bed had been pulled back, but the bed was not slept in. He yelled to me, “Where is Harry?” I came running down the stairs. By this time, Carl was in the garage. He found Harry face down on the floor. He shot himself.
    What I'm telling you here does not define or demonstrate, in any way, what we found, what we felt or how our family felt, or how our lives have been changed forever.
    Just two weeks ago, two teen boys and a young man in P.E.I. were targeted for the under-reported global crime of sextortion. The boys were targeted on social media platforms, where the strangers posed as age-appropriate girls for sex photo swaps. This has to be stopped.
     We as a family support Bill C-63 to protect our children. As advancements continue with technology and as access to devices continues, the risks to our children increase. We must work together as communities, as families and as governments, through user regulations and accountability, to reduce the online abuse of our children and to provide support to all of us.
    Social media platforms must be held accountable. They must incorporate regulations to keep our children safe. Children like our Harry are dying. The evidence of harm to our children is abundantly apparent.
    Our 17-year-old daughter Ella has a Facebook account. She is unable to access Marketplace on Facebook because she is under 18. If you or I were on Marketplace, occasionally you might get a pop-up that says a seller might not be from your country. Obviously, Facebook has the ability to review IP addresses from incoming messages to their system. Can we not use this for our children's safety?
    Now is not the time to enact or to dramatize politics. Colours need not matter in this discussion. Our children are the most important issue here, not colours. This bill provides an opportunity to protect our children and to show political coalition. Our children are in crisis. Some could even say they're at war. It is not time for our children to be used as political pawns to show that one party is more correct than the other. A temporary alliance must be, and is, required to save our children.
(1120)
     The longer Bill C-63 remains a political issue, the more children we will lose. We beg you to please stop wasting time and do something to help save our children.
    Thank you.
    Our children are—
    Thank you, Mrs. Lavers.
    What we do now is a first round with four parliamentarians. I give them six minutes each for questions and responses from each of you.
    We will commence with Ms. Rempel Garner.
    Thank you to all of the witnesses for their courage and advocacy.
    I'll start by saying that this issue is one that has been near and dear to my heart for many years. I've had people very close to me have their lives completely upended and upset by a lack of action on, and tools for preventing, online harassment.
    My first question is this: Do you share the sentiment with me that we shouldn't be waiting two or three years to have action to protect Canadians online, particularly children?
    I'll start with Ms. Todd. It's just a yes or no.
     No, we shouldn't be waiting.
    We shouldn't be waiting.
    Do you also share with me a concern that online platforms have been able to dictate the terms of this discussion and to wiggle through very tough regulatory restrictions that could have prevented a lot of what we heard here today? Do you get the sentiment that, sometimes, these platforms have the upper hand?
    Ms. Todd.
     Yes. These platforms are money-making businesses, and they haven't thought about the online safety of children and adults. We know that because, in the 12 years since I've been focusing on this, we've seen changes. Just recently, six months ago, Meta changed the algorithms for safety.
(1125)
    Thank you.
     They could have done this before.
    I'm with you on that.
    Based on your two answers to that, would you recommend that we say responsibility for setting out rules for these platforms should be punted down the road two years and put into a regulatory body that has the ability...? These platforms would have the ability, behind closed doors and without public scrutiny, to set what those rules are. That doesn't sound like a good approach to me.
    What do you think, Ms. Todd?
     I read up on what Australia has done with an e-safety commissioner. I believe having a regulatory board like an e-safety commission would be a good idea. Unfortunately, it takes a long time to develop. However, down the road—
    What if we could have our cake and eat it, too? What if Parliament, today, legislated a duty of care for online platforms and said, “You have to do this. This is illegal,” and then sent that to a regulator? Starting today, what if there was a legislated duty of care for online operators to do everything that every other witness talked about here today? Do you think this is a better approach?
    If you could describe to me what that duty of care is and who the regulator might be, it's a possibility. Everything has to be laid out, and it has to—
    I'm sorry, folks. I need to interrupt for a moment.
    I'm told by members that the bells are ringing. I'm going to have to ask whether we can have unanimous consent to continue. I'm assuming that, if the bells are ringing, we have 30 minutes, but I could be wrong.
    Can somebody please check?
    Is it a quorum call, or is it—
    The clerk is checking.
    The bells are ringing.
    Are they ringing because the vote is in 10 minutes, or are they ringing...?
    It appears we have 30 minutes.
     That could change.
    Yes, let's keep going.
    I'm asking for unanimous consent. I think everybody is agreeing that we will continue to go on with this, but we have to stop five minutes before the vote.
    Thank you. Please continue.
    I'm sorry, Ms. Todd. It was your turn to respond.
     Thank you.
    Ms. Todd, you just asked me if I could show you that approach, and I can.
    There's a bill in front of Parliament right now, called Bill C-412. It outlines a specific duty of care for online operators that says exactly what they have to do in this. It also specifies the regulatory body. If it was passed today, it could be enacted today, and we could have immediate impacts.
    That's my concern with Bill C-63. It takes this responsibility and puts it into a regulator that hasn't been built. It also gives online platforms the ability to wiggle out of this two, three or four years in the future. My concern is with regard to how many more kids are going to experience this and have detrimental impacts.
    Therefore, I would direct your attention to Bill C-412. However, with the time I have left, I'd like to just ask some questions on whether you think some high-level things that are in there would be a good approach. First of all is the immediate updating of Canada's non-consensual distribution of intimate image laws to include images created by artificial intelligence, otherwise known as deep nudes.
    Do you think we need to do that today, Ms. Todd?
    Yes.
    Do you realize that Bill C-63 would not do that?
     It was my understanding that there was embedded, in Bill C-63, something about AI, but—
    It doesn't. It punts it to a regulator, and it doesn't update the Criminal Code.
    Bill C-412 does.
    Do you also think that the duty of care, which is in Bill C-412 and includes things like other witnesses said about being able to discern who is a minor and having specific actions on what online operators have to do to prevent the instances, is something we should be looking at today?
    Yes, I do.
    Just to be clear, your position is that we should not be necessarily punting this to a regulator and having action two to three years in the future, but that we should be looking for action today through a prescribed duty of care.
    Give a quick response if there is one.
(1130)
     As I said, we realize that creating a digital safety commission in Canada takes a while. My dream would be that, in the meantime, while this is built, there would be something in place that could regulate also.
    Thank you, Ms. Todd.
    Let me now move...and I'm going to try to be fair. If I do the math, I can go six minutes, six minutes and six minutes, and that will give me enough time, then, to stop before the bells.
    Mr. Maloney, you have up to six minutes.
    Thank you.
    Thank you, Madam Chair. I want to thank you and the members around the table for allowing us to do this study, particularly those who voted in favour of proceeding with it.
    As well, I want to thank the witnesses for their powerful and important presentations today.
    I just want to highlight, so that everybody knows, and I think everybody is aware, that the minister announced yesterday that we intend to split Bill C-63 into two parts, with the digital safety and child protection measures separated from the measures that focus on hate. I'd like to get on record that we've agreed to start with a prestudy of three meetings, but I believe that we should continue with three to six meetings on part 1 of the bill. This means a focus on the online harms act and the amendments to the mandatory reporting act. Then we can proceed with a second study, on the balance of the bill, at a later date.
    I do have questions for the witnesses. I just want to emphasize our gratitude to all of you for being here, because we know it is incredibly difficult to share your stories in this fashion or in any other fashion. You have our gratitude and respect.
    Child sexual abuse in Canada is currently illegal. Law enforcement can and should deal with horrible content, as Ms. Rempel was saying. However, as you said, Ms. Lavers, we need to depoliticize this, and the Criminal Code amendments alone are not enough.
    What Bill C-63 would do.... I'll just be clear: A number of the issues that Ms. Rempel Garner was referring to are included in Bill C-63, so I think people need to understand that.
    My question to all of you is this: If we were to proceed with just the Criminal Code measures alone, without the digital safety framework, would that be enough to address the problems we're talking about today, in your opinion? I put the question to all of you.
     Can I jump in on that one, please?
    It is really important that we underscore this point: While criminal law is absolutely essential, it is not enough when we're dealing with the upload of material and content online. What we know is that, of course, we need to be able to convict and charge people, to move down that process, but by the time an image gets uploaded, it can be distributed to multiple platforms at a rate that is just unmanageable. The fact that we have some of the strongest legislation in the world tied to even how we define child sexual abuse material and are still issuing between 10,000 and 20,000 notices per day really shows that gap. While, certainly, as mentioned, those things are important, we know that when we're dealing with online content and harmful content, we cannot rely exclusively on criminal law.
     Do any of the other witnesses want to comment on that before I move on?
     No. Okay.
    Ms. McDonald, I'll go back to you then.
    The 10,000 to 20,000 number that you mentioned a couple of times is quite stark. Without the takedown provisions that are part of Bill C-63.... Let me put it another way. With the takedown provisions that are included in Bill C-63, how would the outcomes be different? What would the time frame difference look like, in your opinion, based on the companies having free reign to make the decision now, versus the provisions of Bill C-63?
    I would say a few things there.
    First off, I know this will sound odd, but we feel that the 24-hour timeline is still too long. We know this because we can see companies removing material within minutes.
    The fact of the matter is that right now there is no accountability; there is no oversight, so we are allowing companies to decide whether or not they are going to agree, whether they want to comply, whether they're going to ignore. There's no accountability. Obviously, we need regulation. Obviously, we need an accountability structure. There need to be consequences for these companies.
     I will also note the level of recidivism that we see. By that I mean that when we have issued a notice to a company about a particular image, we'll see that same image reappear on that same platform. There's absolutely no excuse for that. There is no incentive, and there's no accountability. This is why they do what they want.
(1135)
    I'm curious. If there are 10,000 to 20,000 notices, what's the response rate? What percentage of the notices you send actually generate a response, and how many of those responses generate a response that you're satisfied with?
    Actually, over the course of the few years we've been doing this—and I guess it's a little bit because of the reputation of the organization and what we do—we're seeing about 75% of the material coming down within two days. I think that's roughly correct. However, we're also seeing other providers and bad actors that, in fact, just ignore our notices. They just decide that they're not going to respond.
    Again, this is where we need to see governments around the world uniting and calling to account these digital titans that have been allowed to do what they want without any consequence.
     I believe my time is nearly up, so I'll just say thank you again to you and to all the other witnesses for being here today.
     Thank you, MP Maloney.

[Translation]

    Mr. Fortin, I give you the floor for six minutes.
    Thank you, Madam Chair.
    I thank all the witnesses for being here today.
    Ms. Todd and Ms. Lavers, your testimony is troubling. Even if we know about it, we don’t realize the effect this type of situation can have; we don’t realize the impact social media now has on our children. I consider it our duty as legislators to look into the issue and find the most appropriate solutions.
    I know it is difficult for you to tell these stories. I don’t know how to describe it all, but it is the saddest of tragedies. Thank you for having the courage to testify before the committee.
    That said, as her speaking time was up, I believe Ms. Todd was unable to finish her statement.
    Ms. Todd, I would like to give you a minute or two to finish your testimony, if you please.

[English]

     Thank you for allowing me to do this.
    I will continue about why I feel that Bill C-63 is important.
    I also want to say that we aren't the only country that has afforded this. The U.K. has an Online Safety Act that was established and written into law in 2023, and Australia had the Online Safety Act put into law in 2021. Also, the EU has an online harms act that is similar to what Canada is doing. Canada has been in collaboration with the U.K., Australia and the EU regarding BillC-63.
    Why is this important? It's important because it protects children. What I don't understand—and this is from my own thinking—are all the people who are negative on Bill C-63, saying that it's not about children and it's not about protection. They focus on the parts that Minister Virani has said he and his cabinet would rewrite. It is about protecting children. It's about protecting children and families from the online behaviours of others.
    We can't do this without the tech companies' help. It's really important that we understand this. There are so many people who don't understand this. I read the negative comments, and, personally, it just infuriates me, because my daughter died 12 years ago, and I've waited 12 years for this to happen. Parliamentarians and political groups are arguing about this not being necessary, and we're going.... It just hurts me. It hurts me as a Canadian.
    We need accountability and transparency. We need to support the victims. Passing Bill C-63 is not just about regulation; it's about taking a stand for the safety and dignity of all Canadians. This about ensuring that our digital spaces are as safe and respectful as our physical ones.
    By supporting this bill, we are committing to a future in which the Internet is a place of opportunity and connection, free from threats of harm and exploitation. Passing Bill C-63 would demonstrate the federal government's commitment to adapting to the digital age and ensuring that the Internet remains a safe space for all users. It balances the need for free expression with the imperative to protect individuals from harm, making it a necessary and timely piece of legislation.
    It's also essential to recognize the collective effort in creating platforms that address the challenges faced by children, women and men.
    We've come to realize that what happened to Amanda could happen to anyone. As Amanda herself said, “Everyone has a story.” When these stories emerge, and they belong to your child, your relatives or your grandchildren, they carry more weight.
    No one is immune to becoming a statistic, and, as I have previously shared, I have waited 12 years for this, because on day one of Amanda's death, I knew things needed to change in terms of law, legislation and online safety. I can't bring my child back, but we can certainly keep other children safe.
    Thank you for this time.
(1140)

[Translation]

    Thank you, Ms. Todd.
    Ms. Lavers, I offer you the last minute remaining for you to add your comments, should you have any.

[English]

    Thank you.
     I think what I would like to say is that our children are so precious, and I would ask you as a committee to go home and hug your children, your nieces, your nephews and your grandchildren and just think about what Carol and I and so many other parents have had to endure because of unsafe social media platforms. Just take that home with you and really think about it, because Harry and Amanda could still be here with us if this conversation were not necessary.

[Translation]

    I thank you all. The members of our committee will think a lot about Harry and Amanda.
    Thank you, Madam Chair.
    Thank you, Mr. Fortin.

[English]

    Now we will conclude with Mr. MacGregor, please.
     Thank you very much, Madam Chair.
    I'd like to echo colleagues in thanking the witnesses for joining our committee and helping us wade through a very difficult subject. I'm a father of three daughters. I have 12-year-old twins, so we are dealing with that as parents, with them getting access to the Internet, and the challenges of finding ways to allow them to do that safely.
    Ms. Todd and Ms. Lavers, I'd like to start with you, because part of the debate on the subject of Bill C-63 has been on whether we should just modernize existing laws and changes to the Criminal Code or whether we should add another layer of bureaucracy.
    Briefly, when you had your experiences in reporting this to the police and when the police were trying to make use of existing Criminal Code provisions to solve this for your children, can you talk about some of the limitations you experienced with that and illustrate why you think more is needed based on your personal experiences?
     Do you want to go first, Barbie?
    Sure. Thanks, Carol.
    In our experience, the RCMP worked with the FBI in the United States, but tracking down the IP address of who had contacted Harry was difficult. When they did track it down, it was basically like a call centre type of set-up, and people worked there to extort and sextort. This is a job, just as if they were working at Bell Aliant and taking calls, but they're calling out, and they search for people.
    I don't think that just having the Criminal Code is enough, as Lianna said. I think there have to be stronger guidelines and regulations in order to hold these companies accountable, because they could do it now if they wanted; they have the ability. I have no doubt in my mind that they do, but they don't want to do it, because they use the algorithms that they have to make money and not to keep people safe.
(1145)
    Thank you.
     I just want to be sure Ms. Todd can get in, because I also want to direct a question to Ms. McDonald.
    Please go ahead, Ms. Todd.
     My thinking as a teacher-educator—and I speak to parents, teachers and communities—is that there's an aspect of prevention, intervention and reaction, and legislation becomes a reactionary phase: “Something's happened, and what are we going to do next?” We need more prevention and intervention.
     When I first had to report when this was happening to Amanda, and I reported it to our local RCMP, it was a very challenging and difficult situation. You have to remember that all this started 14 years ago, two years prior to her death. It came back to me that they couldn't find the IP address coming out of the States. It was under a VPN, and they couldn't find anything. This was when she was alive.
    After she died, through an investigation in the Netherlands and the U.K., they found an IP address for a fellow who was victimizing other young girls, and this happened to be Amanda's predator. Through finding information on Facebook, Amanda's name popped up under the account that she had. Ultimately, the Dutch police contacted the Canadian RCMP, and that's how Amanda's predator got caught.
     Things have changed in the last 12 years, and I understand that, but there needs to be more incentive for law enforcement to take on these cases.
     Thank you.
     Not all cases will go to court.
    Thank you very much for that, and just because I'm running out of time, I would like to get to you, Ms. McDonald.
    On your website, your organization has a statement that “exclusion of private messaging features...leaves a substantial threat to children unaddressed.”
    I'm curious about how we approach this, because, of course, there are great privacy concerns in place now. My 12-year-olds are using children's messenger, so we have full control over their contact list, and, in fact, the parents of their friends also have full control, so we have a lot of oversight.
    In what ways would you like the law to be crafted to address what you think is a glaring omission in this bill while still respecting the very real privacy concerns that have been raised with the potential of such an approach?
     Give a brief response if possible. There are 40 seconds left.
    Just to make the point clear, yes, we are concerned that private messaging is not brought in scope. I think the concern is that we see many of these organized crime groups targeting Canadian children in their own homes and bedrooms and basically moving over to these types of applications.
    Our organization has produced a paper on our site that outlines how we can capture some of the metadata without capturing the direct communication. There are a number of ways and opportunities for us to build that in. Certainly that is something that we will continue to raise.
    Thank you very much.
    Thank you to our witnesses. Normally we would go for another few minutes but, unfortunately, the bells are ringing, and it's probably going to take us another 30 minutes before we start.
    We're going to suspend, and we're going to test the second panellists while we're suspended. Then we'll come back to our second panellist session.
     Why don't we continue with this panel after the...? I mean, it's....
     It's up to you.
    There are 17 or 18 minutes.
     I'm going to suspend for now, though.
(1145)

(1205)
     I am now going to start the process.
    Ms. Haugen, you will get a phone call from the clerk or somebody from the room regarding interpretation, if you don't mind answering that.
    I will welcome our witnesses. We have two witnesses by video conference and one in the room.
    We have Madam Frances Haugen, advocate, social platforms transparency and accountability; and we have Madam Miranda Jordan-Smith, executive, both by video conference.
    With us in the room, from Coalition pour la surveillance internationale des libertés civiles, we have Mr. Tim McSorley, national coordinator.
Please wait, each of you, until I recognize you by name before speaking.
For those participating by video conference, please ensure that you have selected, on the bottom of your screen, the language of your choice, because questions will be coming in both languages.
I also ask that you wait to be asked to speak, whether you're a member or a witness, and that you go through the chair.
I will now ask Madam Miranda Jordan-Smith to please commence.
You have up to five minutes.
     Thank you for having me here today.
    As mentioned, my name is Miranda. I'm here today to represent the astronomical and increasing number of victims who have been subjected to online harm. Please allow me to share with you the story of my daughter's abuse.
    At the age of 12, my daughter had a cellphone, which we ensured was equipped with parental controls. She was not on social media at all. Her screen time was limited, and her contacts needed to be approved. Her father could see all of the activity on her phone.
    Therefore, it was shocking to us to learn that our daughter, at the age of 12, could be groomed and manipulated online on a school device that carried a music platform that did not have any age restrictions. It had a chat function, like many, and it was not monitored adequately by the tech provider to detect the online predator she was speaking to. For one year, she was groomed by an online predator, who presented as a peer.
    In June 2022, at the age of 13, she was abducted right beside her school by the predator, a 40-year-old man. When my daughter did not arrive home on the school bus, I reported her as missing.
    From there, a full-scale search for her ensued, with volunteer crews on the ground, knocking on doors and putting up posters. The police in Edmonton merged their historical crimes, missing persons, cybercrime and human trafficking divisions in the hope that our daughter would be found safe.
    For days, we had sleepless and tearful nights, wondering what happened to her. We engaged the media heavily, and our appeals made international news, with the New York Post and the U.K.'s Guardian.
    After a week of our daughter missing, I woke to officers at our door, knowing that they had an update. We knew that either they had found her alive or our daughter would be returned to us in a body bag.
    Naturally, we were overjoyed to learn that our daughter was found. The FBI had seized her from a hotel room in Portland, Oregon, and she was being held at a children's hospital there, where they administered a rape kit and an assessment of her abuse. Immediately, we jumped on a plane to retrieve her from Portland, and we brought her home.
    While the criminal case is still pending, with a federal trial date set for January 13, 2025, the abuse that my daughter suffered is unbearable, impossible to comprehend. Her perpetrator faces 70 to 77 years in prison for a litany of crimes, some of which include kidnapping, rape, sodomy, putting a child on display, possessing and developing child pornography, and crossing an international border with sexual intent.
    My daughter was stuffed into the perpetrator's trunk, and this act alone could have killed her.
    For the last two years, my family has been on a healing journey. The pain and the damage of these horrific events is complex and largely irreparable. We are learning to coexist with it.
    Today I appeal to you to understand the damage of an unregulated Internet and what it creates. Tech companies need to be held accountable and ensure they are acting in a legal and ethical manner. The online harms bill is a step in the right direction.
    While I know that some people feel regulation is an infringement on one's freedom of speech or privacy, I must tell you that my family has no privacy and no anonymity. Everyone knows who we are now, and we have to live with judgment or misconceptions around, “This could not happen to my child,” or that our daughter is somehow gullible, or that she comes from a poor socio-economic background, all of which are not true.
     I often think about regulation. To drive a car, one needs a licence. To fish or hunt, one needs a licence. To go into a porn shop and access pornographic material, one must produce identification. Why is the Internet not regulated the same way, so that users have to verify who they are?
    I think it's time for online reform in Canada, otherwise more children will become victims. The impact is great for families and communities across the country. Already, the U.K. has progressive legislation, and Australia just passed regulation that social media users must be 16.
    I appeal to you today, as members of Parliament, to make changes that will have a profound and lasting impact for the citizens of Canada, because it is my position and my lived experience that no child is safe on the web. If this can happen to us, it could happen to anyone.
    Thank you.
(1210)
     Thank you very much.
    Mr. McSorley, you have up to five minutes, please.
     Thank you to the committee for this invitation to speak to Bill C-63.
     I'm grateful to be here on behalf of the International Civil Liberties Monitoring Group, a coalition of 44 Canadian civil society organizations that work to defend civil liberties in the context of national security and anti-terrorism measures.
    The provisions of this bill, particularly in regard to part 1 of the online harms act, are vastly improved over the government's original 2021 proposal, and we believe that it will respond to urgent and important issues. However, there are still areas of serious concern that must be addressed, especially regarding undue restrictions on free expression and infringement on privacy.
    This includes, in part 1 of the act, first, the overly broad definition of the harm of “content that incites violent extremism or terrorism” will lead to overmoderation and censorship. Further, given the inclusion of the online harm of “content that incites violence”, it is redundant and unnecessary.
    Second, the definition of “content that incites violence” itself is overly broad and will lead to content advocating protest to be made inaccessible on social media platforms.
    Third, the act fails to prevent platforms from proactively monitoring, essentially surveilling, all content uploaded to their sites.
    Fourth, a lack of clarity in the definition of what is considered “a regulated service” could lead to platforms being required to break encryption tools that provide privacy and security online.
    Fifth, proposed requirements for platforms to retain certain kinds of data could lead to the unwarranted collection and retention of the private information of social media users.
     Finally, seventh, there has been little consideration on how this law will inhibit the access of Canadians and people in Canada to content shared by people in other countries.
     Briefly, on part 2 of the act, this section amends Canada's existing hate-crime offences and creates a new stand-alone hate crime offence, and it is only tangentially related to part 1. It has raised serious concerns among human rights and civil liberties advocates in regard to the breadth of the offences and the associated penalties. We've called for parts 2 and 3 to be split from part 1 in order to be considered separately, and we're very pleased to see the government's announcement yesterday that it intends to do just that.
     I'd be happy to speak to any of these issues during questions, and I've submitted a more detailed brief to the committee with specific amendments on these issues. However, I'd like to try to focus in the time I have on the first two points that I've made regarding “content that incites violent extremism or terrorism”, as well as a definition of “content that incites violence”.
     The harm of “content that incites violent extremism or terrorism” is problematic for three reasons and should be removed from the act. First, it is redundant and unnecessary. The definitions of “content that incites violent extremism or terrorism” and “content that incites violence” are nearly identical, the major difference being that the first includes a motivating factor for the violence it is attempting to prevent. These two forms of harms are also treated the same throughout the online harms act, including requirements for platforms to retain information related to these harms for a year to aid in possible investigations.
    Moreover, and maybe most importantly, incitement to violence alone would clearly capture any incitement to violence that arises from terrorist or extremist content. Further definition of what motivates the incitement to violence is unnecessary.
    Second, if included, incitement to terrorism will result in the unjustified censorship of user content. “Terrorism”, and with it “extremism”, are subjective terms based on interpretation of the motivations for a certain act. The same opinion expressed in one context may be viewed as support for terrorism and therefore violent, while, in another, it may be viewed as legitimate and legally protected political speech.
    Acts of dissent become stigmatized and criminalized not because of the acts themselves but because of the alleged motivation behind the acts. As we have seen, this leads to unacceptable incidents of racial, religious and political profiling in pursuit of fighting terrorism.
    Studies have also extensively documented how social media platforms already overmoderate content that expresses dissenting views under the auspices of removing “terrorist content”. The result is that, by including terrorism as a motivating factor for posts that incite violence, the act will be biased against language that is not, in fact, urging violence but is seen as doing so because of personal or societal views of what is considered terrorism or extremism.
     I note also that “extremism” is not defined in Canadian law. This ties into the third key part that we're concerned about, and that's that parts of the language used in this definition are undefined in Canadian law or the Criminal Code. This contradicts the government's main justification for all seven harms—that they align with the Criminal Code and do not expand existing offences.
(1215)
    You have 30 seconds.
    Okay—I'll finish up here.
     There are several examples, but one I'll share is that the definition of “incitement to terrorism”, in fact, includes only “actively encourages” an act. There's no definition of what “actively encourages” includes. It's a much lower threshold than “incitement”, and, in fact, goes against what is already in the Criminal Code, which punishes either “instructing” or “counselling”—
     Thank you very much. We'll get back to you with the questions.
     Thank you.
    Now we have Madam Haugen for up to five minutes, please.
    Hello. Thank you for inviting me. I was invited only yesterday, so unfortunately you'll have to listen to my stream of consciousness.
    I want to address three main issues. The first is with regard to what can be known and what is unknown. You've probably heard lots of things in the media around what the damages and risks are to kids. I thought the testimony regarding Miranda's child was incredibly compelling. There are many, many stories like this.
     Ms. Haugen, please speak closer to your microphone. Apparently, they're not able to hear you well.
(1220)
    Everything was fine on the audio check yesterday.
    Is this any better?
    Yes. Please continue.
     Good. I'm sorry. I was invited only yesterday, and none of the headphones on the approved list were available in Puerto Rico.
    You can go and read all these horrific accounts of the impacts on kids, but the thing I want to emphasize is what can be known and what cannot be known today. There are lags in the impact of when we see these effects. We look at 16-year-olds today and we say that we know what the harms are of social media, but the 16- and 17-year-olds today came online at 12 and 13.

[Translation]

    Madam Chair, there is no interpretation.

[English]

    Okay. I will suspend for a moment while we try to figure something out in the room.
    Panellists, give me a moment, please.
    I'm sorry. I was using the exact same set-up a couple of days ago on CNN, and it was fine, so I don't really know what to change.
    Is that a little better?
    Ms. Haugen, they will have to give you a call and reschedule you to attend at a different time. We have members in the room who do not speak English. They require interpretation, and that is absolutely—
    The point is not that people don't speak English. The point is that Parliament is bilingual. We work in both official languages.

[Translation]

    Exactly. We must speak in both official languages. Unfortunately, interpretation cannot be offered because of the equipment you have.

[English]

    They will give you a call and reschedule you for another time that is convenient. If you would like to stay and listen, that's okay, but you don't have to. I apologize for that.
    We will continue with the panellists we have in the time we have left.
    Madam Rempel Garner, you have six minutes, please.
     Thank you, Chair.
    Ms. Jordan-Smith, thank you for your testimony today and for your courage in speaking out on this issue.
    One thing that struck me about your testimony was that you talked about how your daughter was victimized through a platform that you weren't even aware she was using. It strikes me that in order to have a duty of care that would address the fact that technology changes all the time—there will always be some new platform that kids are on—we need to have a very clear but also broad definition of who, or what, a duty of care would apply to. It can't just be Meta or a couple of the known players, can it?
     I've been giving some thought to what that could mean. I tend towards having a broader term. The term I would like to use is something like “online operator”, which would mean the owner or operator of a platform, such as a service online or application that connects to the Internet, or that is used or that reasonably could be expected to be used by a minor, including a social media service and an online video gaming service, so that it's very clear that as new platforms come up in the future, as technology changes, you as a parent aren't having to guess whether or not your child is being exposed to a platform that might not be covered by the law.
    Would you support that type of recommendation?
    I would support it, because I think it's the best way to capture it. It would encapsulate all types of online activity, and I think that's what is important.
     Thank you.
    Then, bridging from the who or the what to what they're responsible for, I'd like to very briefly suggest some things that online platforms or operators should be responsible for: a significant duty of care to prevent physical harm or incitement of such harm, including online bullying and harassment; online sexual violence against a minor, including any conduct directed at a minor online; the creation or dissemination of imagery that is sexually exploitative, humiliates them, is harmful to their dignity or invades their privacy; the promotion and marketing of products or services that are currently unlawful for minors; and patterns that indicate or encourage addiction-like behaviour.
    Would you say we're on the right track there in terms of looking at the scope of things an online operator would have to ensure that minors were not subjected to?
(1225)
     I think, minimally, that would be ideal.
    A few additional thoughts of mine would be that they, again, have age restrictions, that there is a responsibility on tech companies to identify who their users should be. My daughter was on a platform that didn't have any age restrictions, so to me, that's completely irresponsible.
     I'm glad you brought this up, because it was actually my next question. It's a question between you and Mr. McSorley.
    The government, in Bill C-63, has not thought about age verification at all. It's punting this to a regulator that's not created, and it's going to be two or three years down the road.
    Witnesses on the other panel have suggested that age verification can be done right now through algorithms, and I agree with that. You can detect someone's age using an algorithm. If Meta knows somebody wants to buy a KitchenAid spatula, it knows how old they are.
    I'm wondering, between the two of you, if the way that we should be squaring the circle on age verification to protect personal information, while also ensuring that minors are not subjected to harm, is by requiring online operators to use algorithms or other technological means to determine age within a degree of accuracy.
    Does that make sense to you, Ms. Jordan-Smith?
    I think so. I think they need to determine an appropriate age for the users of their platform.
    Then, verification to me seems like a normal thing that's even happening online, where some providers are already self-regulating. As an example, for LinkedIn, in order to be verified, you have to upload your driver's licence. For me to take a course at Oxford, I had to upload my passport for them to verify my identity, that I'm actually the person taking that course. I don't see it as a huge deal.
     I'd just like to get to Mr. McSorley quickly on that.
     Does that seem like a way to square the circle?
     I think it would be.
     One thing that we've raised in our brief, and I think others will raise, is that there's a lack of requirements for algorithmic transparency from social media platforms in the bill. If that were integrated, I think that would answer lots of the concerns.
    Now, do both of you think this should be something that is put in a legislated duty of care as opposed to being punted off to a regulator where parliamentarians would have no say—or you would have no say—in what that looked like? Does that make sense to you, Mr. McSorley?
    I'm not enough of an expert on where the technology is at right now to say whether or not it should be legislated or what the best approach would be.
    What about you, Ms. Jordan-Smith? Would you like to see that requirement put forward in a legislated duty of care as quickly as possible, or would you like it to be punted to a regulator two or three years into the future at best?
     My opinion is that something needs to happen now. I see it as responsible to have rules and regulation around the Internet.
    As I mentioned in my statement, there's freedom, but then there are limits to freedom, because we need law and order. I don't see the issue with people having to verify who they are. Then, where the Criminal Code is concerned, if there's a perpetrator out there on the web, if he had to verify who he was in order to get on that site, it would be easy for police to then enforce—
    Thank you very much for that.
     I'm now going to go to Madam Dhillon for up to six minutes, please.
    Thank you, Madam Chair.
    My questions will be for Ms. Jordan-Smith.
    Thank you so much for coming today and sharing your painful testimony. We heard the painful testimony of the witnesses before as well. It's not easy to hear this, so I cannot even imagine what you and your family are going through, and your child as well. My heart breaks to hear of these things happening.
     I would like to say that the three core duties of this bill would be to act responsibly, to protect children, and to remove child sex abuse material and nude images shared without consent, including deepfakes.
    Can you please speak to the importance of these duties? What would you like to say?
     I think that would be adequate. I can't think of anything else necessarily to add at this stage.
    I think the sharing and the way that things live on the web needs to be stopped as well, because I know for Carol and Barbie, who were in the first panel, that part of the revictimization of victims is that these things are shared and reposted. I think there needs to be some language around preventing that from happening.
(1230)
     Do you think that if this legislation had existed at the time your daughter was abducted it would have made a difference in her life, or you would not be facing this today?
     I don't know to what extent we wouldn't be in the same position, but I certainly think that if tech companies were held responsible and actually moderated their sites, it would have been removed, so I think it could have been prevented by tech companies as well as by having an online safety component with a bill like Bill C-63.
     To follow up on that, we keep hearing about censorship from people who are alleging that this bill is a new form of censorship. That suggests that there is currently freedom for online platforms, and that there is control over what you see, but these platforms are the ones that have the control. It's not the users.
    Do you believe users have freedom on these platforms currently? What are your thoughts? What would you like to see done differently?
     I've heard the freedom-of-speech argument. My position on it, as I said, is that we like to think that we live in a society of free will, but within that there are frameworks of law and order in our society, and there needs to be some measure of control.
    I'm an advocate for having rules in place. I don't see it as an infringement or censorship. I think with hate speech, hate crimes, racial slurs or extensive online bullying, the police are doing their best to try to deal with those things, but I certainly think it would give some teeth to police to be able to enforce something, and it would set the ground rules for Canada on how we operate on the web, so it would be a cultural shift.
     Again, I have a follow-up question to that comment. I was actually going to ask you about this. The harms online also include those that induce a child to harm themselves or subject them to bullying.
    Why do you think it's important to include these as well in the bill?
     I think it's important because the result is that children are dying, and they're committing suicide through online bullying, or exploitation, or sextortion, and it's happening at an increasing rate. C3P put out a report that Snapchat and Instagram have the highest rates of predatory behaviour, and I know children on those platforms. People create content, thinking it's innocent, and I think, oh, that's your child dancing at the age of 12—how cute. However, I think there's a predator watching that.
     I don't know if you've been watching the justice committee over the last few weeks, getting to this bill. Why is it urgent? Filibusters have taken place. You talk about revictimization. The testimony of victims was read over and over again, over hours and hours, rather than addressing the bill.
    Why is it urgent to pass this bill, in your opinion? What are we risking by delaying the passage of this bill further and further?
     For me, I think we're behind the eight ball. I mentioned that the U.K. and Australia have more progressive legislation. So does the EU. I'm not suggesting that those are necessarily perfect pieces of legislation, but there has to be something in place, and something is better than nothing.
    I think it would give rules for how people operate on the web, especially even people who aren't doing harmful behaviour. However, when I think about even grown women I know who have received unsolicited nude...from grown men, or catfishing and things like that, there are victims of all of these things, and right now it is the Wild West. I think there needs to be something in place that gives people the legal framework for how to operate within this space.
     We keep hearing mention of a regulator over and over again in these questions. I would like to ask you this: How do you feel knowing that, without a regulator, victims would have to sue social media platforms to enforce the law? What would you like to say about that?
     I would say that there should be, again, more in place. We are victims, and we are in the middle of a lawsuit with the provider of a platform.
(1235)
    You would like to see Bill C-63 pass quickly, then.
     Yes, absolutely.
     Thank you so much.
     Thank you very much.

[Translation]

    Mr. Fortin, you have the floor for six minutes.
    Thank you, Madam Chair.
    Thank you, Ms. Jordan‑Smith, for being with us today. Your story is very troubling, like those from Ms. Todd and Ms. Lavers, whom we heard before you. Obviously, we will keep your experiences in mind all throughout our work on this important issue.
    Bill C‑63deals with the issue of online hate, as well as bullying and protecting images, among other things. The minister announced he would be dividing the bill. We can therefore hope to look more quickly into the issue of bullying and use of social media, specifically by passing the new Online Harms Act. That’s good news for us.
    For your part, did anyone speak to you about the idea of dividing Bill C‑63 in order to work more quickly on the Online Harms Act? If so, what did you think?

[English]

     I'm sorry. The interpretation of that question didn't come through. Is there somebody who can pose that question in English?
    It didn't come through at all.
    Have you...?
    As the chair, I'll ask, but then I'll have to turn it over to someone else who has more capabilities.
    Do you have—
    On the bottom, do you have it turned to the language of your choice? In this case, I guess it's “English”?
    It's like a little globe.
     I don't see where to do that.
    It's a little globe icon.
    Okay, someone will....
     Do you have it?
     No, I don't see it.
     It could be on the top. They change the layout sometimes.
     Someone will have to give you a call.
    Thank you.

[Translation]

    Mr. Fortin, could you restart, please?

[English]

    Can you hear me? Are you receiving the interpretation?

[Translation]

    Do you hear the interpretation when I speak to you in French?
    Are you talking to me?
    No.

[English]

    I'm asking Madam Jordan.
    Yes, I can.
    Thank you. I'm sorry.
    We're going to start you again, but I'm going to now really be tight on the minutes, because we have to conclude at one o'clock.

[Translation]

    Mr. Fortin, you have the floor.
    Thank you, Madam Chair.
    Ms. Jordan‑Smith, can you hear me?

[English]

     Yes, thank you.

[Translation]

    I think she is not hearing me, Madam Chair.

[English]

     She said “yes”.
    Did she say “yes”? I didn't hear it.

[Translation]

    Ms. Jordan‑Smith, did anyone explain how the device works before you started your testimony?

[English]

     I missed the interpretation part yesterday in my testing.

[Translation]

    She says there is no interpretation.

[English]

     It's working now.
    Is it working?

[Translation]

    Do you understand me, Ms. Jordan‑Smith? Do you hear the interpretation?

[English]

     I do, yes.

[Translation]

    Did anyone explain how the device works before your testimony?

[English]

    It wasn't covered in the testing yesterday, so I didn't know to set it to that.

[Translation]

    Very well. I just want to make sure it was properly explained to you. I am not blaming you. Witnesses must be told how interpretation works beforehand, because it is important for all Canadians, both those who speak French and those who speak English, to be able to hear your testimony. It is part of my role to make sure everyone fully understands you, because your testimony is important and must be understood by everyone. That said, I am aware it’s not necessarily obvious, when it is the first time.
    As I was saying earlier, I thank you for being with us. Your testimony is touching, like that from Ms. Todd and Ms. Lavers, who preceded you. We are aware of the seriousness of your daughter’s victimization. Rest assured we will keep it in mind throughout our work on Bill C‑63.
    The question I was asking you—before we realized you were not hearing the interpretation—was on Bill C‑63. The minister announced he could divide it so that we can work more quickly on every aspect of it, especially the issue of online harm. What is the most urgent, in my opinion, is protecting our children, and I think most of us feel the same way.
    What do you think about the idea of dividing Bill C‑63 in order to study the Online Harms Act and the issue of online hate separately?
(1240)

[English]

     For me, it's whatever is easiest to administer. If there are contentious components to Bill C-63, then I feel as though I'd capitulate to government folks who know how things are administered to extrapolate components of the Criminal Code or pieces that might be up for debate and then create other pieces of legislation that might work better within the system.
    I guess that's all I can really say on that topic. I don't see an issue with them being separated, so long as they're effective and they work within the system.

[Translation]

     Thank you, Ms. Jordan‑Smith.
    Bill C‑63provides for the creation of the Digital Safety Commission of Canada, the position of Digital Safety Ombudsperson of Canada and the Digital Safety Office of Canada.
    Are you aware of their respective roles? What do you have to say about them?

[English]

     Yes. I've read about it. I suppose, speaking candidly, I look at that as being a function of government. I sort of rely on what would actually work systemically. Those are components that I just don't have the expertise in. I suppose my experience is more boots on the ground and a lived experience that was absolutely horrific. I can just outline what the issues are in our case and in cases that I've seen.

[Translation]

     Thank you, Ms. Jordan‑Smith. Excuse me for interrupting you. I do not have a lot of time left.
    It was my understanding that the measure which seems most urgent to you is verifying users’ age on social media. Did I understand correctly? If not, could you specify which measure you think we should focus on?

[English]

     To me, age verification seems like a natural given, because I see it already happening. The other piece is mandating that the providers or operators are actually monitoring these sites. Right now there's an issue with AI as well. They're not detecting certain words that are sexually charged...not just racial or violent terms. In the case of my daughter, the stuff we'd found on the platform that she was attached to was still live up until a few months ago, when our lawyer had it removed. I think AI monitoring and age restriction and verification are key and fundamental to having some control over the Internet.

[Translation]

    Thank you.
    Beyond age verification, are there other aspects of the issue we should look into? For example, should we go further and ask the manufacturers of electronic devices—such as computers, telephones and tablets—to add a mechanism for controlling what appears on them? Obviously, I agree there is also the issue of platforms, which I did not raise. However, when it comes to the equipment, such as computers and telephones, do you think something else should be done as well?
(1245)

[English]

    Please be very brief, Madam, if you have a response to that.
     I mean, I'm not opposed to it. I wouldn't have an issue with it. Again, I think you'll run into other Canadians who say it's an infringement. I look at it as all being focused around safety. My perspective is that I would have no problem with it, and I know other people who wouldn't.
    Thank you very much.
    Mr. MacGregor, you have six minutes, please.
    Thank you, Madam Chair.
    I would like to welcome Mr. McSorley to the committee. He was a witness at my other committee, the public safety committee. We really appreciated that.
    In your exchange with Ms. Rempel Garner, the subject of “algorithmic transparency” came up. It's a term that I am familiar with and am very much interested in. When people are posting online on these platforms, the platforms are not just passive bystanders. Their algorithms can both amplify and suppress. Algorithms can be very useful. They can direct people towards their interests; they can help make searching much more efficient, but they can also push people down to some very dark corners. I think over the last number of years we have seen the real-world results of that.
    My colleague Peter Julian has come up with a bill, Bill C-292. I'm sure there's a variety of ways to approach this, but in terms of taking a more active role in promoting algorithmic transparency, how do you figure that fits into this subject matter that we're discussing today?
     I think it is very important, because as we address different forms of harms, we need to look at modelling different approaches. That's why, in our comments, we're not proposing changes in terms of addressing child sexual abuse material or other things, but focusing specifically around national security and anti-terrorism concerns.
    That said, in terms of algorithmic transparency, we think that it would be important to, overall, have a mandate for these platforms to have to be open about the development of their algorithms and what kind of information is being fed into them.
    As we've argued in other places around the current artificial intelligence and data act, there need to be third party assessments to ensure that these algorithms are doing their job, not only in ensuring that they're efficient in what they're being asked to do but also in ensuring that there aren't negative repercussions. We know that already, with the use of artificial intelligence and algorithms, there have been documented cases of bias around age, gender and race, so it's important that there be openness, and that's something that's missing from Bill C-63.
    Thank you.
    In your opening statement, you were talking, I think, about how anything posted on social media could be viewed in one context as inciting terrorist violence but in another as perfectly acceptable speech. Can you elaborate on this and maybe provide an example?
     Sure, definitely. I think that there are a few areas where we could look at that. We could look at that domestically, where there are individuals marching in the street for one cause—and we've been active on raising concerns about the characterization of any support for Palestinian human rights as being in support of terrorism—but a march for another issue that unfurls in the exact same way as the exact same call for an action would not be characterized that way. We think of the Occupy movement. There's a concern that, even though there's no direct call for violence, because of the stigma of simply labelling something as potentially being in support of terrorism, it could be viewed as a harm.
    I'd like to expand that, too, because one of our concerns is how this will be applied internationally. There are countries that would say that human rights defenders in Egypt or people resisting in Ukraine are defined by other countries as terrorists. How would the platforms be expected to decide how to monitor all that? If it was limited simply to incitement to violence without that subjective decision-making around it, it would be a lot more clear for the platforms. It would be clearer for the audience, and it would also be easier to challenge it when it comes to the digital safety commission if there are any issues.
(1250)
    In other words, as legislators, we need to very much pay attention to the subjective interpretations of the laws that we are proposing. That is very well taken.
    We've also had a lot of conversations about how we want platforms to take more responsibility for the content, but do you have concerns at all about platforms proactively monitoring all content and how they would deal with the collection and retention of private information? Can you elaborate on those concepts for the committee?
    One of the concerns we had with the original iteration of this bill was that it would have mandated platforms to have to, in fact, monitor essentially all content that was going up. It no longer does that, but we're concerned that it doesn't stop them from doing that. It doesn't block that. The reason for that is, if that were the case and they were to do that, they would by default have to rely almost primarily on algorithmic decision-making, and that's not included. As we said, transparency is included in the bill. It would almost by default result in an overmoderation. They would have to lean towards taking down content and dealing with it later, rather than narrowly defining it.
    In some cases, for child sexual abuse material, there are hashes and things that can be used to specifically identify particular types of content, and that would avoid having to monitor all online content, but that's missing from this bill in terms of an obligation for the platform to not engage in that activity.
     Thank you.
    Thank you very much, Mr. MacGregor.
    We still have a bit of time, and I'm going to shorten the time frame.

[Translation]

    I will divide the next rounds as follows: I give the Conservatives three minutes, the Liberals three minutes, the Bloc Quebecois a minute and a half and the Nw Democratic Party a minute and a half.
    Ms. Rempel-Garner, you have three minutes.
    Thank you, Madam Chair.

[English]

     I'm going to go back to you, Ms. Jordan-Smith, to pick up on a line of questions from my colleague Mr. Fortin.
    He asked you if you knew what the regulators did, and I think you gave a very succinct answer. You said that would be up to the government. It's concerning to me, though, that you don't know what they do. I'm not saying that pejoratively; I'm saying it from the perspective of a parent who's gone through so much loss. I feel that the stated goal of Bill C-63 is for you to know what protections you have upon its passage, but they don't exist, because all it does is create a regulator where there's no guarantee that the protections that you're asking for are going to be legislated by Parliament.
    In that, my preference would be that Parliament legislate that duty of care immediately, so that either law enforcement or existing regulatory bodies could take action immediately.
    Does that make sense to you?
     Yes, I think it needs to happen, because right now we have nothing.
    Exactly.
    There is a part of Bill C-63, in proposed section 4, where it talks about enhancing reporting requirements. Some of my colleagues have suggested that we need a regulator to do that. In the bill itself, it says that these reporting requirements would go to a law enforcement body that already exists.
    Would you support those provisions that are enhancing laws that already exist and that would go through law enforcement? Is that perhaps what the government should be focusing on while also ensuring that there's a legislative duty of care, so that if one of us asked you again whether you know what this law does or what protections you're afforded, you'd be able to answer that with a degree of certainty that brought you some peace in your heart?
     Just to clarify, what I said was that I don't know how it would be administered. I'm not ignorant of the bill, but I feel as though there has to be something in place that guides the citizens of Canada on how to engage with the Internet as well as providers. That's where I feel as though the bill is strong.
     When we talk about how it's levied out or whether there are ombudsmen or different regulating bodies, that's the piece that, I fully admit, I don't know how that would function. That could be a weakness.
     With my last 30 seconds, I want to thank you for all your work on this. I'd like to follow up with you after this meeting. I would like to send you a copy of Bill C-412, which actually specifies that in great certainty. It's a bill before Parliament that we could pass today and actually get these protections with some certainty for parents like you. I think that's what we all want to do here. We don't want to wait another two or three years.
     Thank you.
(1255)
    Thank you very much.
     Now, for the three minutes, I'm going to go to Madame Brière, please.
     Thank you, Madam Chair.
     I'd like to ask a question to Madam Lianna McDonald, if she's still online.
     Madam Todd is.
    Madam Todd, okay.
    Madame Todd, I would like to hear from you why the argument that this is an unnecessary bureaucracy is unfounded.
    What do you mean by “unnecessary bureaucracy”?
    We heard a lot about the regulator.
     I've been following what's been happening in Australia. I have actually met with the e-commission in Australia that does the regulatory administration. For all those who might not know, there was a question about what are the parts of a prospective e-commission.
    The digital safety commission of Canada would be a body that would oversee the enforcement of the online harms act. A digital safety ombudsperson would support users and advocate for the public interest of online safety. There would also be duties for social media operators, and platforms would be required to implement measures to mitigate, protect children and make harmful content inaccessible.
    It's a whole ball with different parts in the ball. That's sort of what's needed. It's not going to happen overnight, because in Australia it took years to come up with. We're doing this for long-term safety. We're not doing this for the short term. We want to do it right. Everything that we do takes time and care, really.
     What I'm not happy about is that, as parents, we are being asked questions that we might not know about. We've come here to talk about is why Bill C-63 is important to enact. This is one of the last First World countries to enact something like this. That's why we need to have it done. We do need the regulatory board, and the e-commission is a regulatory board. That's what I have to say about that one.
    Thank you.
    I know the question has been asked to someone else, but do you think that your daughter might still be alive today had Bill C-63 been law at the time?
    There are lots of moving parts with that, with the mental health aspect. However, remembering back in the days when the intimate images were posted on Facebook and through sites, I remember chasing down Facebook and trying to figure out where to report it, and I couldn't. There was no button back then. There was no email address. There was no phone number to contact.
    As I saw my child realize this and disintegrate before my eyes, it was heartbreaking. We're 12 years past this, and there are still things that are happening that are harmful to children and families that need to be changed. Technology has advanced so much that it makes it more challenging and more difficult. If we were back 12 years ago, I'm sure that regulators could have fixed these problems, but with AI, with the advancement of artificial intelligence now—
     Thank you for your testimony.

[Translation]

    Our next speaker is Mr. Fortin, who has a minute and a half.

[English]

     We will follow with one and a half minutes for Mr. MacGregor.

[Translation]

    Thank you, Madam Chair.
    Given the time allocated to me, I will go quickly.
    First of all, I want to thank you, Ms. Jordan-Smith and you too, Mr. McSorley, even though I did not ask you any questions. It does not mean your presence is unimportant. Your testimony was clear and I duly noted it.
    Ms. Jordan-Smith, if I may, I would just like to ask you one last question.
    We all hope the Online Harms Act, meaning Bill C‑63, will pass quickly. The bill proposes it and, in my opinion, there might be some adjustments to be made. However, I think we owe it to ourselves to be diligent. This will not solve all the problems, but it will criminalize certain behaviours and create entities for complaints and follow-up.
    In your opinion, would it help if funds were dedicated to awareness campaigns—be they on television, the radio or social media—to target our young men and young women and help protect them against this?
    I ask the question because they will be constantly facing these situations, no matter what laws we pass. In your opinion, could an awareness campaign in the media change anything for victims?
(1300)

[English]

     I mean, certainly it could help, but C3P already did an awareness campaign around the bill. That has been shared and on TV throughout the summer. My feeling is that we're of the digital era, and adults actually don't necessarily use social media appropriately.
    Thank you.
    Mr. MacGregor.
     Thank you, Madam Chair.
     Mr. McSorley, I'll turn to you again. With respect to encryption tools that are designed to protect online security and privacy, do you believe anything in Bill C-63 poses a risk to those?
    If you do have concerns, do you have any ideas on what we as a committee should be looking at in terms of addressing those concerns?
    I know that the question around encryption and private messaging is a live one in terms of how to approach it in this bill. Our concern is that once encryption is broken in private messaging, it can't be fixed. Once one actor is able to access encrypted information, it's possible that it will be able to be accessed for other reasons. Encryption really underlies so much of the security that we have online, from banking to even protecting the privacy of people of all ages.
    In the bill right now, there's no explicit requirement that platforms have to protect encryption. We're concerned that the lack of any acknowledgement of that could lead to platforms interpreting the bill as allowing them to do so. We think further protections should be included in the bill.
     Thank you for that clarification.
     Thank you.
    Thank you very much to the panellists who appeared here.
    Thank you as well to those who appeared in the first panel for bearing with us. Sometimes it happens that we have to suspend abruptly like that, without knowing in advance, but your persistence in staying with us is very much appreciated.
    On behalf of all the committee, we are very appreciative of your testimony. Our hearts are with you. Whether you encountered this 12 years ago or last year, it's still like yesterday.
    For all the parents who are with us, thank you so much for sharing your personal stories and for continuing to share them so that we don't forget and we do move as quickly as we can, as legislators, with a bill like this. Thank you very much.
    The last comment I'll make is that if there's anything you wish to say that you felt you did not have the appropriate time to say, we would welcome anything you would like to send to us in writing. I know that all of you have sent briefs already, so it's not necessary, but if there's anything you wish us to consider further, please send it to our committee through the clerk.
(1305)
    Thank you so much.
    Is it the will of the committee that we adjourn?
    Some hon. members: Agreed.
    The Chair: Thank you so much.
    The meeting is adjourned.
Publication Explorer
Publication Explorer
ParlVU