:
I call the meeting to order.
Welcome to meeting number 125 of the House of Commons Standing Committee on Justice and Human Rights.
Pursuant to Standing Order 108(2) and the motion adopted on December 2, 2024, the committee is meeting in public to begin its study of the subject matter of Bill , an act to enact the online harms act, to amend the Criminal Code, the Canadian Human Rights Act and an act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service and to make consequential and related amendments to other acts.
[Translation]
Before welcoming our witnesses this morning, I wish to call your attention to the presence in the room of Ms. Sokmony Kong, Secretary of the Cambodian division of the Assemblée parlementaire de la Francophonie. This parliamentary official was chosen by the Association des secrétaires généraux des parlements francophones, or ASGPF, in recognition of her very highly esteemed work within her organization. Ms. Kong chose the Parliament of Canada for her two-week professional development placement.
We wish you an excellent stay with us, Ms. Kong. As a former member-at-large representing America for the APF, I’m very pleased you chose Canada. I therefore wish you a good stay with us.
[English]
I would like to welcome our witnesses for the first hour. They are all appearing by video conference.
Before I say their names, I have a few reminders.
I'm going to ask colleagues in the room or by video conference to please wait until I recognize you by name before speaking, and to ensure you address your questions through the chair. Please do not take the floor until after you are recognized.
For witnesses participating by video conference, please ensure you have selected, on the bottom of your screen, the language of your choice.
[Translation]
I also want to say that all of the equipment belonging to the witnesses here with us this morning was tested and everything is working well.
[English]
As the chair, I want to note that it is my responsibility, with the help of the clerk, to keep time as best we can, in fairness to the witnesses and to the members in the room asking questions, and also to suspend for a minute so that the second group of panellists can be brought in for their hour.
I will now introduce them to you and ask each of them to give their opening remarks for up to five minutes.
With us this morning, from the Amanda Todd Legacy Society, is Madam Carol Todd, founder and mother.
[Translation]
We also welcome Ms. Lianna McDonald, executive director of the Canadian Centre for Child Protection.
[English]
We also have Carl Burke and Madam Barbie Lavers, who are participating together as individuals.
Now I will ask Madam Todd to please begin with her opening comments.
I'm speaking to you from Vancouver, British Columbia. I thank you for this invitation to participate in this prestudy session on Bill .
To start, the majority of what I'm going to say in the next five minutes and in answer to the questions are my thoughts and my thoughts only.
Today I must stress the importance of Bill , the online harms act. This bill is a comprehensive approach to addressing the growing concerns of harmful content on the Internet. Online safety, I feel, is a shared responsibility, and everyone—users, parents, educators and platforms—plays a role in creating a safer online world by ensuring protection, accountability and support.
My name is Carol Todd. I'm widely known as the mother of Amanda Todd. I am a teacher-educator in British Columbia with my work primarily centred on education on digital literacy, online safety and child abuse prevention, namely exploitation and sextortion. Providing children, teachers and families with the knowledge and skills to navigate the digital world is essential and is one of the reasons I created a legacy, a non-profit, in Amanda's memory.
My daughter, Amanda Todd, was a Canadian teenager whose tragic story brought international attention to the severe impacts of cyberbullying, online harassment and exploitation. She was born in November 1996 and faced relentless harassment both online and off-line as a young teenager. She ultimately took her life in October 2012. As we know, parents shouldn't outlive their children in preventable situations.
Amanda's ordeal began when she was 12 years old. She was persuaded by an online stranger to expose her breasts on a webcam. This individual saved the image and later used it to blackmail her, threatening to share the photos with her friends and family if she didn't perform more explicit acts. Despite changing schools multiple times, Amanda couldn't escape the harassment, and the blackmailer continued to follow her for two and a half years, creating fake profiles to spread the image and further humiliate her.
In September 2012, five weeks before Amanda took her own life, Amanda posted a YouTube video entitled “My story: Struggling, bullying, suicide, self-harm”, in which she showed flash cards to share her painful experiences. She detailed the bullying, physical assaults and severe emotional distress that she endured both online and off-line. The video went viral after her death, and currently it's been viewed about 50 million times across the world.
Amanda's death prompted significant public and governmental responses. In 2022, Aydin Coban, a Dutch man, was convicted of harassing and extorting Amanda in a Canadian court and sentenced to 13 years in prison. He is currently serving his Canadian time in the Netherlands.
Amanda's story continues to resonate, highlighting the urgent need for stronger protections against online harassment and better supports for victims of bullying, cyberbullying and exploitation.
There are so many voices that remain unheard due to fear, judgment or shame, or because they can no longer speak. It is vital to let these silent voices be heard and to create a more compassionate and understanding world, where we help and not hurt.
Over the past decade, we have observed rapid changes in technology. We have watched devices that were a useful tool for communication turn into fun devices that can exploit and hurt others. Since its inception, the Internet has taken on darker tones. The word “algorithms” is now in our vocabulary, where it once never was.
Research has highlighted some of the harmful effects related to screen time. These effects include reduced well-being, mood disorders, depression and anxiety. These effects impact children and adults alike in a world filled with online media.
With increased access to the Internet come easier access to violent and explicit online content that can impact sexual attitudes and behaviours, harm to children through the creation, sharing and viewing of sexual abuse material, and increased violence against women and girls, as well as sex trafficking.
Governments must take action to enact new laws and modify existing ones.
To make the online world safer, we must increase education and awareness. We must have stronger regulations and laws, like Bill . We have to improve the behaviours of the online platforms. We need parental controls and monitoring, and we need to encourage reporting through services like Cybertip.ca.
Bill —
:
Good morning, everyone.
Thank you very much to the committee for this opportunity.
My name is Lianna McDonald, and I am the executive director of the Canadian Centre for Child Protection, a registered charity that has been operating for nearly 40 years to protect Canadian children.
For the past 22 years, we have been operating Cybertip.ca, Canada's national tip line to report online crimes against children. In 2017, we launched Project Arachnid, an innovative online platform that targets the removal of child sexual abuse material at scale. It is through this critical work that we have witnessed first-hand and all too often the colossal injury and harm that happen every single day to children online. The unregulated Internet has basically destroyed childhood as we have historically known it, while children and families are paying a devastating price for the ongoing failure of government to regulate online spaces.
There has been a steady increase in the number and seriousness of online crimes against children since the rise of social media, which created a perfect storm of injury and harm to children. We saw another huge and significant jump after 2020 with COVID. These are key events that have exacerbated and intensified harm to our children.
We've handed children technology that has been weaponized against them by predators and technology services, and I'll underscore what I'm talking about. Every month, through Cybertip.ca, the tip line, we process over 2,500 reports, and these are reports by Canadians who know to come in to us. We've seen a 760% increase in luring reports since the start of COVID in 2020. We've managed more than 4,300 requests from youth and their caregivers in the last year alone. We receive approximately seven sextortion reports every single day at the tip line, and we've processed close to 4,000 sexually explicit deepfake images and videos of children. Finally, since 2017, we've issued over 40 million takedown notices to companies, to get them to take down child sexual abuse material.
There is no other entity in Canada that is doing the work we are doing. Regrettably and in a very difficult way, we are witnessing first-hand the scale of harm that is happening to our children and how it has evolved over the years. We are dealing with young people who are terrified about what an offender will force them to do next, youth who are frantically trying to get their child sexual abuse material taken down, families who are dealing with situations that have escalated well beyond anything—anything—that they ever could have imagined.
We are supporting survivors of child sexual abuse material from all over the world. Abusive imagery of them is endlessly uploaded and re-uploaded on platforms available to anyone with an Internet connection. These victims and children have been stripped of their privacy and their dignity, and, in fact, they have no recourse. Their rights are repeatedly violated while the predators who obsess over them, the ones who stalk, harass and target them, are shielded by the cloak of anonymity that technology affords them.
To try to deal with this mess through Project Arachnid, we are issuing between 10,000 and 20,000 notices to companies every single day. These notices are overwhelmingly for known child sexual abuse material. By that, I mean imagery that has been circulating for years, tormenting survivors, yet still these platforms get to choose whether or not they take it down. They get to regulate themselves. They get to decide all on their own what is okay and what is not okay. It's outrageous, and it must change.
To put this all into perspective for you quickly as I close, I'll give you a sampling of the actual interventions that our organization deals with every day. Imagine—and this is happening—a young girl between the ages of 11 and 12 who is being tortured daily by a group of anonymous men. Every day, she is ordered to go into the school bathroom and is instructed to self-abuse and harm while recording the material. She is paralyzed by fear. She does as she's told. The requests get worse, more degrading, more harmful. Eventually, she reaches out for help.
Imagine that a teenage boy is tricked into sending a sexual image to a person he thought was a peer, but that person turns on him and threatens to send the image to all his friends and family. He is shocked. He is terrified. He believes with every bone in his body that they will do what they've said. He pays them. It's not enough. The threats keep coming. He is desperate to make it stop.
These are just a few of the examples that we hear. They are not hypotheticals—
:
Good morning. Thank you for inviting my husband and me to speak today.
We want to introduce our son to you today. Harry was a very outgoing and inclusive young man. He was intelligent and handsome. He was an athlete and a brother, and he was loved by his friends and his community.
Harry was a patriot. He loved his country. He joined the cadets at age 14. Then in grade 11, in fall 2022, Harry joined the Prince Edward Island Regiment. He was 16. He was doing his basic training in Summerside, Prince Edward Island, on the weekends, while going to Souris Regional School full time. He only had one weekend left to complete his basic training for the RCAC. He was so proud of Canada, and he planned to dedicate his life to serving his country.
I'm Barbie Lavers. My husband is Carl Burke. We are Harry's parents. Harry was 17 years old when we lost him to sextortion. As a family, we had many conversations with Harry and his sister Ella about safe online use and about the dangers of sharing images online. Unfortunately, our family was not aware of the word “sextortion”. We had never heard of it.
On April 24, Harry came to his dad and told him that he had screwed up. He had shared intimate pictures with a girl, supposedly his own age, from Nova Scotia. This individual was now demanding money, or they would share Harry's images with all of his contacts, and in particular with his commanding officer in the RCAC. Sadly, this individual did share some of the images with his friends in cadets, and Harry knew this. I was also contacted on Instagram by what was apparently the same individual, who told me they would ruin his life.
When Harry came to us that evening and told us what had happened, all four of us sat at the table, talked about it and made a plan to contact the local RCMP in the morning. We thought Harry was comfortable with this plan, but sadly, he wasn't.
On the morning of April 25, we were getting ready for our day. My husband went down to check on Harry. The sheets in his bed had been pulled back, but the bed was not slept in. He yelled to me, “Where is Harry?” I came running down the stairs. By this time, Carl was in the garage. He found Harry face down on the floor. Harry had shot himself.
What I'm telling you here does not define or demonstrate, in any way, what we found, what we felt as a family, or how our lives have been changed forever.
Just two weeks ago, two teen boys and a young man in P.E.I. were targeted for the under-reported global crime of sextortion. The boys were targeted on social media platforms, where the strangers posed as age-appropriate girls for sex photo swaps. This has to be stopped.
We as a family support Bill to protect our children. As advancements continue with technology and as access to devices continues, the risks to our children increase. We must work together as communities, as families and as governments, through user regulations and accountability, to reduce the online abuse of our children and to provide support to all of us.
Social media platforms must be held accountable. They must incorporate regulations to keep our children safe. Children like our Harry are dying. The evidence of harm to our children is abundantly apparent.
Our 17-year-old daughter Ella has a Facebook account. She is unable to access Marketplace on Facebook because she is under 18. If you or I were on Marketplace, we might occasionally get a pop-up that says a seller might not be from your country. Obviously, Facebook has the ability to review IP addresses from incoming messages to their system. Can we not use this for our children's safety?
Now is not the time to enact or to dramatize politics. Colours need not matter in this discussion. Our children are the most important issue here, not colours. This bill provides an opportunity to protect our children and to show political coalition. Our children are in crisis. Some could even say they're at war. It is not time for our children to be used as political pawns to show that one party is more correct than the other. A temporary alliance must be, and is, required to save our children.
The longer Bill remains a political issue, the more children we will lose. We beg you to please stop wasting time and do something to help save our children.
Ms. Todd, you just asked me if I could show you that approach, and I can.
There's a bill in front of Parliament right now, called Bill . It outlines a specific duty of care for online operators that says exactly what they have to do in this. It also specifies the regulatory body. If it was passed today, it could be enacted today, and we could have immediate impacts.
That's my concern with Bill . It takes this responsibility and puts it into a regulator that hasn't been built. It also gives online platforms the ability to wiggle out of this two, three or four years in the future. My concern is with regard to how many more kids are going to experience this and have detrimental impacts.
Therefore, I would direct your attention to Bill . However, with the time I have left, I'd like to just ask some questions on whether you think some high-level things that are in there would be a good approach. First of all is the immediate updating of Canada's non-consensual distribution of intimate image laws to include images created by artificial intelligence, otherwise known as deep nudes.
Do you think we need to do that today, Ms. Todd?
:
Thank you, Madam Chair. I want to thank you and the members around the table for allowing us to do this study, particularly those who voted in favour of proceeding with it.
As well, I want to thank the witnesses for their powerful and important presentations today.
I just want to highlight, so that everybody knows, and I think everybody is aware, that the announced yesterday that we intend to split Bill into two parts, with the digital safety and child protection measures separated from the measures that focus on hate. I'd like to get on record that we've agreed to start with a prestudy of three meetings, but I believe that we should continue with three to six meetings on part 1 of the bill. This means a focus on the online harms act and the amendments to the mandatory reporting act. Then we can proceed with a second study, on the balance of the bill, at a later date.
I do have questions for the witnesses. I just want to emphasize our gratitude to all of you for being here, because we know it is incredibly difficult to share your stories in this fashion or in any other fashion. You have our gratitude and respect.
Child sexual abuse in Canada is currently illegal. Law enforcement can and should deal with horrible content, as Ms. Rempel was saying. However, as you said, Ms. Lavers, we need to depoliticize this, and the Criminal Code amendments alone are not enough.
What Bill would do.... I'll just be clear: A number of the issues that Ms. Rempel Garner was referring to are included in Bill C-63, so I think people need to understand that.
My question to all of you is this: If we were to proceed with just the Criminal Code measures alone, without the digital safety framework, would that be enough to address the problems we're talking about today, in your opinion? I put the question to all of you.
:
I would say a few things there.
First off, I know this will sound odd, but we feel that the 24-hour timeline is still too long. We know this because we can see companies removing material within minutes.
The fact of the matter is that right now there is no accountability; there is no oversight, so we are allowing companies to decide whether or not they are going to agree, whether they want to comply, whether they're going to ignore. There's no accountability. Obviously, we need regulation. Obviously, we need an accountability structure. There need to be consequences for these companies.
I will also note the level of recidivism that we see. By that I mean that when we have issued a notice to a company about a particular image, we'll see that same image reappear on that same platform. There's absolutely no excuse for that. There is no incentive, and there's no accountability. This is why they do what they want.
:
Thank you, Madam Chair.
I thank all the witnesses for being here today.
Ms. Todd and Ms. Lavers, your testimony is troubling. Even if we know about it, we don’t realize the effect this type of situation can have; we don’t realize the impact social media now has on our children. I consider it our duty as legislators to look into the issue and find the most appropriate solutions.
I know it is difficult for you to tell these stories. I don’t know how to describe it all, but it is the saddest of tragedies. Thank you for having the courage to testify before the committee.
That said, as her speaking time was up, I believe Ms. Todd was unable to finish her statement.
Ms. Todd, I would like to give you a minute or two to finish your testimony, if you please.
:
Thank you for allowing me to do this.
I will continue about why I feel that Bill is important.
I also want to say that we aren't the only country that has addressed this. The U.K. has an Online Safety Act that was written into law in 2023, and Australia's Online Safety Act became law in 2021. Also, the EU has online harms legislation that is similar to what Canada is doing. Canada has been in collaboration with the U.K., Australia and the EU regarding Bill.
Why is this important? It's important because it protects children. What I don't understand—and this is from my own thinking—are all the people who are negative on Bill , saying that it's not about children and it's not about protection. They focus on the parts that has said he and his cabinet would rewrite. It is about protecting children. It's about protecting children and families from the online behaviours of others.
We can't do this without the tech companies' help. It's really important that we understand this. There are so many people who don't understand this. I read the negative comments, and, personally, it just infuriates me, because my daughter died 12 years ago, and I've waited 12 years for this to happen. Parliamentarians and political groups are arguing about this not being necessary, and we're going.... It just hurts me. It hurts me as a Canadian.
We need accountability and transparency. We need to support the victims. Passing Bill is not just about regulation; it's about taking a stand for the safety and dignity of all Canadians. This is about ensuring that our digital spaces are as safe and respectful as our physical ones.
By supporting this bill, we are committing to a future in which the Internet is a place of opportunity and connection, free from threats of harm and exploitation. Passing Bill would demonstrate the federal government's commitment to adapting to the digital age and ensuring that the Internet remains a safe space for all users. It balances the need for free expression with the imperative to protect individuals from harm, making it a necessary and timely piece of legislation.
It's also essential to recognize the collective effort in creating platforms that address the challenges faced by children, women and men.
We've come to realize that what happened to Amanda could happen to anyone. As Amanda herself said, “Everyone has a story.” When these stories emerge, and they belong to your child, your relatives or your grandchildren, they carry more weight.
No one is immune to becoming a statistic, and, as I have previously shared, I have waited 12 years for this, because on day one of Amanda's death, I knew things needed to change in terms of law, legislation and online safety. I can't bring my child back, but we can certainly keep other children safe.
Thank you for this time.
:
Thank you very much, Madam Chair.
I'd like to echo colleagues in thanking the witnesses for joining our committee and helping us wade through a very difficult subject. I'm a father of three daughters. I have 12-year-old twins, so we are dealing with that as parents, with them getting access to the Internet, and the challenges of finding ways to allow them to do that safely.
Ms. Todd and Ms. Lavers, I'd like to start with you, because part of the debate on the subject of Bill has been on whether we should just modernize existing laws and changes to the Criminal Code or whether we should add another layer of bureaucracy.
Briefly, when you had your experiences in reporting this to the police and when the police were trying to make use of existing Criminal Code provisions to solve this for your children, can you talk about some of the limitations you experienced with that and illustrate why you think more is needed based on your personal experiences?
In our experience, the RCMP worked with the FBI in the United States, but tracking down the IP address of who had contacted Harry was difficult. When they did track it down, it was basically like a call centre type of set-up, and people worked there to extort and sextort. This is a job, just as if they were working at Bell Aliant and taking calls, but they're calling out, and they search for people.
I don't think that just having the Criminal Code is enough, as Lianna said. I think there have to be stronger guidelines and regulations in order to hold these companies accountable, because they could do it now if they wanted; they have the ability. I have no doubt in my mind that they do, but they don't want to do it, because they use the algorithms that they have to make money and not to keep people safe.
:
My thinking as a teacher-educator—and I speak to parents, teachers and communities—is that there's an aspect of prevention, intervention and reaction, and legislation becomes a reactionary phase: “Something's happened, and what are we going to do next?” We need more prevention and intervention.
When this was first happening to Amanda and I reported it to our local RCMP, it was a very challenging and difficult situation. You have to remember that all of this started 14 years ago, two years prior to her death. It came back to me that they couldn't find the IP address coming out of the States. It was under a VPN, and they couldn't find anything. This was when she was alive.
After she died, through an investigation in the Netherlands and the U.K., they found an IP address for a fellow who was victimizing other young girls, and this happened to be Amanda's predator. Through finding information on Facebook, Amanda's name popped up under the account that she had. Ultimately, the Dutch police contacted the Canadian RCMP, and that's how Amanda's predator got caught.
Things have changed in the last 12 years, and I understand that, but there needs to be more incentive for law enforcement to take on these cases.
:
Thank you very much for that, and just because I'm running out of time, I would like to get to you, Ms. McDonald.
On your website, your organization has a statement that “exclusion of private messaging features...leaves a substantial threat to children unaddressed.”
I'm curious about how we approach this, because, of course, there are great privacy concerns in place now. My 12-year-olds are using children's messenger, so we have full control over their contact list, and, in fact, the parents of their friends also have full control, so we have a lot of oversight.
In what ways would you like the law to be crafted to address what you think is a glaring omission in this bill while still respecting the very real privacy concerns that have been raised with the potential of such an approach?
:
I am now going to start the process.
Ms. Haugen, you will get a phone call from the clerk or somebody from the room regarding interpretation, if you don't mind answering that.
I will welcome our witnesses. We have two witnesses by video conference and one in the room.
We have Madam Frances Haugen, advocate, social platforms transparency and accountability; and we have Madam Miranda Jordan-Smith, executive, both by video conference.
With us in the room, from Coalition pour la surveillance internationale des libertés civiles, we have Mr. Tim McSorley, national coordinator.
Please wait, each of you, until I recognize you by name before speaking.
For those participating by video conference, please ensure that you have selected, on the bottom of your screen, the language of your choice, because questions will be coming in both languages.
I also ask that you wait to be asked to speak, whether you're a member or a witness, and that you go through the chair.
I will now ask Madam Miranda Jordan-Smith to please commence.
You have up to five minutes.
:
Thank you for having me here today.
As mentioned, my name is Miranda. I'm here today to represent the astronomical and increasing number of victims who have been subjected to online harm. Please allow me to share with you the story of my daughter's abuse.
At the age of 12, my daughter had a cellphone, which we ensured was equipped with parental controls. She was not on social media at all. Her screen time was limited, and her contacts needed to be approved. Her father could see all of the activity on her phone.
Therefore, it was shocking to us to learn that our daughter, at the age of 12, could be groomed and manipulated online on a school device that carried a music platform that did not have any age restrictions. It had a chat function, like many, and it was not monitored adequately by the tech provider to detect the online predator she was speaking to. For one year, she was groomed by an online predator, who presented as a peer.
In June 2022, at the age of 13, she was abducted right beside her school by the predator, a 40-year-old man. When my daughter did not arrive home on the school bus, I reported her as missing.
From there, a full-scale search for her ensued, with volunteer crews on the ground, knocking on doors and putting up posters. The police in Edmonton merged their historical crimes, missing persons, cybercrime and human trafficking divisions in the hope that our daughter would be found safe.
For days, we had sleepless and tearful nights, wondering what happened to her. We engaged the media heavily, and our appeals made international news, with the New York Post and the U.K.'s Guardian.
After a week of our daughter missing, I woke to officers at our door, knowing that they had an update. We knew that either they had found her alive or our daughter would be returned to us in a body bag.
Naturally, we were overjoyed to learn that our daughter was found. The FBI had seized her from a hotel room in Portland, Oregon, and she was being held at a children's hospital there, where they administered a rape kit and an assessment of her abuse. Immediately, we jumped on a plane to retrieve her from Portland, and we brought her home.
While the criminal case is still pending, with a federal trial date set for January 13, 2025, the abuse that my daughter suffered is unbearable, impossible to comprehend. Her perpetrator faces 70 to 77 years in prison for a litany of crimes, some of which include kidnapping, rape, sodomy, putting a child on display, possessing and developing child pornography, and crossing an international border with sexual intent.
My daughter was stuffed into the perpetrator's trunk, and this act alone could have killed her.
For the last two years, my family has been on a healing journey. The pain and the damage of these horrific events is complex and largely irreparable. We are learning to coexist with it.
Today I appeal to you to understand the damage of an unregulated Internet and what it creates. Tech companies need to be held accountable and ensure they are acting in a legal and ethical manner. The online harms bill is a step in the right direction.
While I know that some people feel regulation is an infringement on one's freedom of speech or privacy, I must tell you that my family has no privacy and no anonymity. Everyone knows who we are now, and we have to live with judgment or misconceptions around, “This could not happen to my child,” or that our daughter is somehow gullible, or that she comes from a poor socio-economic background, all of which are not true.
I often think about regulation. To drive a car, one needs a licence. To fish or hunt, one needs a licence. To go into a porn shop and access pornographic material, one must produce identification. Why is the Internet not regulated the same way, so that users have to verify who they are?
I think it's time for online reform in Canada; otherwise, more children will become victims. The impact on families and communities across the country is great. Already, the U.K. has progressive legislation, and Australia just passed regulation requiring social media users to be at least 16.
I appeal to you today, as members of Parliament, to make changes that will have a profound and lasting impact for the citizens of Canada, because it is my position and my lived experience that no child is safe on the web. If this can happen to us, it could happen to anyone.
Thank you.
:
Thank you very much, Chair.
Thank you to the committee for this invitation to speak to Bill .
I'm grateful to be here on behalf of the International Civil Liberties Monitoring Group, a coalition of 44 Canadian civil society organizations that work to defend civil liberties in the context of national security and anti-terrorism measures.
The provisions of this bill, particularly in regard to part 1 of the online harms act, are vastly improved over the government's original 2021 proposal, and we believe that it will respond to urgent and important issues. However, there are still areas of serious concern that must be addressed, especially regarding undue restrictions on free expression and infringement on privacy.
In part 1 of the act, these include the following. First, the overly broad definition of the harm of “content that incites violent extremism or terrorism” will lead to overmoderation and censorship. Further, given the inclusion of the online harm of “content that incites violence”, it is redundant and unnecessary.
Second, the definition of “content that incites violence” is itself overly broad and will lead to content advocating protest being made inaccessible on social media platforms.
Third, the act fails to prevent platforms from proactively monitoring, essentially surveilling, all content uploaded to their sites.
Fourth, a lack of clarity in the definition of what is considered “a regulated service” could lead to platforms being required to break encryption tools that provide privacy and security online.
Fifth, proposed requirements for platforms to retain certain kinds of data could lead to the unwarranted collection and retention of the private information of social media users.
Finally, sixth, there has been little consideration of how this law will inhibit the access of Canadians and people in Canada to content shared by people in other countries.
Briefly, on part 2 of the act, this section amends Canada's existing hate-crime offences and creates a new stand-alone hate crime offence, and it is only tangentially related to part 1. It has raised serious concerns among human rights and civil liberties advocates in regard to the breadth of the offences and the associated penalties. We've called for parts 2 and 3 to be split from part 1 in order to be considered separately, and we're very pleased to see the government's announcement yesterday that it intends to do just that.
I'd be happy to speak to any of these issues during questions, and I've submitted a more detailed brief to the committee with specific amendments on these issues. However, I'd like to try to focus in the time I have on the first two points that I've made, regarding “content that incites violent extremism or terrorism” as well as the definition of “content that incites violence”.
The harm of “content that incites violent extremism or terrorism” is problematic for three reasons and should be removed from the act. First, it is redundant and unnecessary. The definitions of “content that incites violent extremism or terrorism” and “content that incites violence” are nearly identical, the major difference being that the first includes a motivating factor for the violence it is attempting to prevent. These two forms of harm are also treated the same throughout the online harms act, including requirements for platforms to retain information related to these harms for a year to aid in possible investigations.
Moreover, and maybe most importantly, incitement to violence alone would clearly capture any incitement to violence that arises from terrorist or extremist content. Further definition of what motivates the incitement to violence is unnecessary.
Second, if included, incitement to terrorism will result in the unjustified censorship of user content. “Terrorism”, and with it “extremism”, are subjective terms based on interpretation of the motivations for a certain act. The same opinion expressed in one context may be viewed as support for terrorism and therefore violent, while, in another, it may be viewed as legitimate and legally protected political speech.
Acts of dissent become stigmatized and criminalized not because of the acts themselves but because of the alleged motivation behind the acts. As we have seen, this leads to unacceptable incidents of racial, religious and political profiling in pursuit of fighting terrorism.
Studies have also extensively documented how social media platforms already overmoderate content that expresses dissenting views under the auspices of removing “terrorist content”. The result is that, by including terrorism as a motivating factor for posts that incite violence, the act will be biased against language that is not, in fact, urging violence but is seen as doing so because of personal or societal views of what is considered terrorism or extremism.
I note also that “extremism” is not defined in Canadian law. This ties into the third key part that we're concerned about, and that's that parts of the language used in this definition are undefined in Canadian law or the Criminal Code. This contradicts the government's main justification for all seven harms—that they align with the Criminal Code and do not expand existing offences.
:
Ms. Jordan-Smith, thank you for your testimony today and for your courage in speaking out on this issue.
One thing that struck me about your testimony was that you talked about how your daughter was victimized through a platform that you weren't even aware she was using. It strikes me that in order to have a duty of care that would address the fact that technology changes all the time—there will always be some new platform that kids are on—we need to have a very clear but also broad definition of who, or what, a duty of care would apply to. It can't just be Meta or a couple of the known players, can it?
I've been giving some thought to what that could mean. I tend towards a broader term. The term I would like to use is something like “online operator”, which would mean the owner or operator of a platform, such as an online service or application that connects to the Internet and that is used, or could reasonably be expected to be used, by a minor, including a social media service and an online video gaming service. That way, it's very clear that as new platforms come up in the future and technology changes, you as a parent aren't having to guess whether or not your child is being exposed to a platform that might not be covered by the law.
Would you support that type of recommendation?
:
I'm glad you brought this up, because it was actually my next question. It's a question between you and Mr. McSorley.
The government, in Bill , has not thought about age verification at all. It's punting this to a regulator that has not yet been created, and that's going to be two or three years down the road.
Witnesses on the other panel have suggested that age verification can be done right now through algorithms, and I agree with that. You can detect someone's age using an algorithm. If Meta knows somebody wants to buy a KitchenAid spatula, it knows how old they are.
I'm wondering, between the two of you, if the way that we should be squaring the circle on age verification to protect personal information, while also ensuring that minors are not subjected to harm, is by requiring online operators to use algorithms or other technological means to determine age within a degree of accuracy.
Does that make sense to you, Ms. Jordan-Smith?
:
Thank you, Madam Chair.
My questions will be for Ms. Jordan-Smith.
Thank you so much for coming today and sharing your painful testimony. We heard the painful testimony of the witnesses before as well. It's not easy to hear this, so I cannot even imagine what you and your family are going through, and your child as well. My heart breaks to hear of these things happening.
I would like to say that the three core duties of this bill would be to act responsibly, to protect children, and to remove child sex abuse material and nude images shared without consent, including deepfakes.
Can you please speak to the importance of these duties? What would you like to say?
:
Thank you, Madam Chair.
Thank you, Ms. Jordan‑Smith, for being with us today. Your story is very troubling, like those from Ms. Todd and Ms. Lavers, whom we heard before you. Obviously, we will keep your experiences in mind all throughout our work on this important issue.
Bill deals with the issue of online hate, as well as bullying and protecting images, among other things. The minister announced he would be dividing the bill. We can therefore hope to look more quickly into the issue of bullying and use of social media, specifically by passing the new Online Harms Act. That’s good news for us.
For your part, did anyone speak to you about the idea of dividing Bill C‑63 in order to work more quickly on the Online Harms Act? If so, what did you think?
:
Very well. I just want to make sure it was properly explained to you. I am not blaming you. Witnesses must be told how interpretation works beforehand, because it is important for all Canadians, both those who speak French and those who speak English, to be able to hear your testimony. It is part of my role to make sure everyone fully understands you, because your testimony is important and must be understood by everyone. That said, I am aware it’s not necessarily obvious, when it is the first time.
As I was saying earlier, I thank you for being with us. Your testimony is touching, like that from Ms. Todd and Ms. Lavers, who preceded you. We are aware of the seriousness of your daughter’s victimization. Rest assured we will keep it in mind throughout our work on Bill .
The question I was asking you—before we realized you were not hearing the interpretation—was on Bill C‑63. The minister announced he could divide it so that we can work more quickly on every aspect of it, especially the issue of online harm. What is the most urgent, in my opinion, is protecting our children, and I think most of us feel the same way.
What do you think about the idea of dividing Bill C‑63 in order to study the Online Harms Act and the issue of online hate separately?
:
Thank you, Madam Chair.
I would like to welcome Mr. McSorley to the committee. He was a witness at my other committee, the public safety committee. We really appreciated that.
In your exchange with Ms. Rempel Garner, the subject of “algorithmic transparency” came up. It's a term that I am familiar with and am very much interested in. When people are posting online on these platforms, the platforms are not just passive bystanders. Their algorithms can both amplify and suppress. Algorithms can be very useful. They can direct people towards their interests and can help make searching much more efficient, but they can also push people into some very dark corners. I think over the last number of years we have seen the real-world results of that.
My colleague Peter Julian has come up with a bill, Bill . I'm sure there's a variety of ways to approach this, but in terms of taking a more active role in promoting algorithmic transparency, how do you figure that fits into this subject matter that we're discussing today?
:
I think it is very important, because as we address different forms of harms, we need to look at modelling different approaches. That's why, in our comments, we're not proposing changes in terms of addressing child sexual abuse material or other things, but focusing specifically around national security and anti-terrorism concerns.
That said, in terms of algorithmic transparency, we think that it would be important to, overall, have a mandate for these platforms to have to be open about the development of their algorithms and what kind of information is being fed into them.
As we've argued in other places around the current artificial intelligence and data act, there need to be third party assessments to ensure that these algorithms are doing their job, not only in ensuring that they're efficient in what they're being asked to do but also in ensuring that there aren't negative repercussions. We know that already, with the use of artificial intelligence and algorithms, there have been documented cases of bias around age, gender and race, so it's important that there be openness, and that's something that's missing from Bill .
:
Thank you, Madam Chair.
[English]
I'm going to go back to you, Ms. Jordan-Smith, to pick up on a line of questions from my colleague Mr. Fortin.
He asked you if you knew what the regulators did, and I think you gave a very succinct answer. You said that would be up to the government. It's concerning to me, though, that you don't know what they do. I'm not saying that pejoratively; I'm saying it from the perspective of a parent who's gone through so much loss. I feel that the stated goal of Bill is for you to know what protections you have upon its passage, but they don't exist, because all it does is create a regulator where there's no guarantee that the protections that you're asking for are going to be legislated by Parliament.
In that, my preference would be that Parliament legislate that duty of care immediately, so that either law enforcement or existing regulatory bodies could take action immediately.
Does that make sense to you?
:
I've been following what's been happening in Australia. I have actually met with the eSafety commission in Australia, which does the regulatory administration. For all those who might not know, there was a question about what the parts of a prospective commission would be.
The digital safety commission of Canada would be a body that would oversee the enforcement of the online harms act. A digital safety ombudsperson would support users and advocate for the public interest in online safety. There would also be duties for social media operators: platforms would be required to implement measures to mitigate harm, protect children and make harmful content inaccessible.
It's one whole with different parts within it. That's the sort of thing that's needed. It's not going to happen overnight, because in Australia it took years to develop. We're doing this for long-term safety, not for the short term. We want to do it right. Everything that we do takes time and care, really.
What I'm not happy about is that, as parents, we are being asked questions that we might not know the answers to. What we've come here to talk about is why Bill is important to enact. Canada is one of the last First World countries to enact something like this. That's why we need to have it done. We do need the regulatory board, and the eSafety commission is a regulatory board. That's what I have to say about that one.
:
Thank you, Madam Chair.
Given the time allocated to me, I will go quickly.
First of all, I want to thank you, Ms. Jordan-Smith, and you too, Mr. McSorley, even though I did not ask you any questions. That does not mean your presence is unimportant. Your testimony was clear, and I duly noted it.
Ms. Jordan-Smith, if I may, I would just like to ask you one last question.
We all hope the Online Harms Act, meaning Bill , will pass quickly. The bill proposes it and, in my opinion, there might be some adjustments to be made. However, I think we owe it to ourselves to be diligent. This will not solve all the problems, but it will criminalize certain behaviours and create entities for complaints and follow-up.
In your opinion, would it help if funds were dedicated to awareness campaigns—be they on television, the radio or social media—to target our young men and young women and help protect them against this?
I ask the question because they will be constantly facing these situations, no matter what laws we pass. In your opinion, could an awareness campaign in the media change anything for victims?
:
Thank you very much to the panellists who appeared here.
Thank you as well to those who appeared in the first panel for bearing with us. Sometimes it happens that we have to suspend abruptly like that, without knowing in advance, but your persistence in staying with us is very much appreciated.
On behalf of all the committee, we are very appreciative of your testimony. Our hearts are with you. Whether you encountered this 12 years ago or last year, it's still like yesterday.
For all the parents who are with us, thank you so much for sharing your personal stories and for continuing to share them so that we don't forget and we do move as quickly as we can, as legislators, with a bill like this. Thank you very much.
The last comment I'll make is that if there's anything you wish to say that you felt you did not have the appropriate time to say, we would welcome anything you would like to send to us in writing. I know that all of you have sent briefs already, so it's not necessary, but if there's anything you wish us to consider further, please send it to our committee through the clerk.
Thank you so much.
Is it the will of the committee that we adjourn?
Some hon. members: Agreed.
The Chair: Thank you so much.
The meeting is adjourned.