The Vice-Chair (Mr. Larry Brock):
I call the meeting to order.
[English]
Welcome to meeting 126 of the House of Commons Standing Committee on Justice and Human Rights.
At this point in the meeting, I'd like to propose the adoption of the Bill C-63 prestudy budget in the amount of $23,250. I understand that budget was previously distributed to all members.
Can I see a show of hands in support?
(Motion agreed to)
The Vice-Chair (Mr. Larry Brock): That's unanimous. It's adopted.
Thank you.
Pursuant to Standing Order 108(2) and the motion adopted on December 2, 2024, the committee is meeting in public to begin its study on the subject matter of Bill C-63, an act to enact the online harms act, to amend the Criminal Code, the Canadian Human Rights Act and an act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service and to make consequential and related amendments to other acts.
At this juncture, I'd like to welcome our witnesses for the first hour.
Appearing by video conference, on behalf of the Quebec Bar, we have Catherine Claveau, president; Nicolas Le Grand Alary, secretariat of the order and legal affairs; and Michel Marchand, member, criminal law expert group.
Appearing in person is Madame Anaïs Bussières McNicoll, director, fundamental freedoms program, Canadian Civil Liberties Association.
Witnesses and members, please wait until I recognize you by name before speaking. For those participating by video conference, please ensure that you have selected the language of your choice for simultaneous interpretation, which is on the bottom left of your screen, and please mute yourself when you are not speaking.
I remind all members to take the floor only after being recognized by the chair.
Without any further delay, the floor is yours. Each witness has five minutes.
Who would like to start?
Perhaps Madame Bussières McNicoll can start.
Ms. Anaïs Bussières McNicoll (Director, Fundamental Freedoms Program, Canadian Civil Liberties Association):
Honourable members of the committee, good afternoon.
The Canadian Civil Liberties Association, or CCLA, appreciates the opportunity to share its view on Bill C-63.
The CCLA is an independent, national non-governmental organization founded in 1964 with a mandate to defend and foster the civil liberties, human rights and democratic freedoms of all people across Canada. We work to achieve strong protections for freedom of expression, privacy and principles of fundamental justice. That work is central to our mandate.
The CCLA recognizes the importance of legislative measures to protect some of the most vulnerable members of society from especially harmful forms of online speech. In that sense, the CCLA recognizes that some of the duties established under part 1 of the bill for operators of a regulated service are welcome. However, the current iteration of the online harms act also sets out broader duties that need to be clarified and limited appropriately. Otherwise, they will give rise to problems in relation to freedom of expression.
For example, the general duty set out in subsection 55(1) of the proposed act requires operators to implement measures that are adequate to mitigate the risk that users of the service will be exposed to harmful content on the service. The scope of the provision is too vague. In the absence of proper parameters, operators will likely try to fulfill this vaguely defined duty as efficiently and economically as possible, potentially at the expense of users' freedom of expression. For instance, operators could proactively monitor content, which at this point is not prohibited under the new act, or they could take down content as determined by non-transparent algorithms.
The general duty imposed on operators to implement tools and processes to flag harmful content, as per section 59 of the proposed act, has similar flaws, which would likely jeopardize freedom of expression as well. As it is written, the online harms act would allow operators to remove various types of flagged content, without giving the user who posted the content an opportunity to present their view. In fact, as written, the proposed act would even implicitly allow operators to remove various types of flagged content without first having to determine whether the content was indeed harmful.
The first three recommendations in our written submission to the committee address these concerns. We recommend that operators, in their efforts to fulfill their statutory duties, be prohibited from engaging in mass surveillance and unduly limiting users' freedom of expression. We also recommend that the newly created body in the bill, the digital safety commission of Canada, be required to check annually that operators are fulfilling their duties as they relate to users' rights.
The CCLA applauds the justice minister's recently announced plan to remove parts 2 and 3 from this bill. This addresses a joint request made months ago by the CCLA and a number of civil society groups to ensure that the committee's study of part 1 was not overshadowed by controversial changes to the Criminal Code and the Canadian Human Rights Act. The CCLA is of the view that Parliament should not pass parts 2 and 3 of the bill.
With respect to the proposed Criminal Code amendments, the new hate-motivated offence would irrationally increase the maximum sentence associated with any offence in Canada to life imprisonment. This excessive judicial discretion paves the way for disproportionate sentencing and an increase in plea bargaining by innocent and vulnerable defendants. It would also hinder free speech in Canada.
The CCLA also objects to the new “fear of hate propaganda offence or hate crime” provision. Criminal law should be a means of holding individuals accountable for what they have done, not for what others fear they might do. Allowing a judge to limit the freedom and expression of an individual who is not even suspected or accused of having committed a crime, let alone convicted of one, unreasonably and unjustifiably infringes on several rights protected by the Canadian Charter of Rights and Freedoms.
Lastly, I will turn to part 3 of the bill, the amendments being proposed to the Canadian Human Rights Act. The CCLA is of the view that the proposed amendments are neither an appropriate nor effective way to address the problem of hate speech in our modern society. The amendments would result in an onslaught of complaints to human rights organizations, which are already chronically under-resourced.
Thank you.
I would be pleased to answer your questions.
Ms. Catherine Claveau (President, Barreau du Québec):
Good afternoon, members of the committee.
My name is Catherine Claveau, and I am the president of the Barreau du Québec. Joining me from the Barreau du Québec are Michel Marchand, member of the criminal law expert group, and Nicolas Le Grand Alary, lawyer, secretariat of the order and legal affairs. Thank you for giving the Quebec bar association the opportunity to comment on Bill C-63.
Given our experience in criminal law and human rights, our remarks will focus solely on parts 2 and 3 of the bill, the proposed amendments to the Criminal Code and the Canadian Human Rights Act.
Let's start with part 2, the Criminal Code amendments. With the significant rise in hate crimes, most of which are based on race and ethnic origin, it is paramount that the bill provide the courts with the tools to respond effectively, while ensuring they adhere to the principles of fundamental justice and Canada's constitutional requirements. That is why the Barreau du Québec supports the Quebec justice minister's call for lawmakers to remove the religious exemption in the Criminal Code for hate propaganda.
The Quebec bar association considers it essential to codify a definition of hate. On one hand, this would encourage people to report incidents while helping communities clearly understand what is prohibited. On the other, it would give all actors in the justice system, the police in particular, a clear framework within which to operate.
However, we have concerns about the definition being proposed in the bill for the term “hatred”, which is based on the decision in Whatcott. In that case, the Supreme Court of Canada ruled on the constitutionality of a human rights provision prohibiting hate publications. The Quebec bar association considers the key decision in criminal matters to be the 1990 decision in Keegstra. The Supreme Court relied on the analysis in Keegstra in Mugesera in 2005.
In both decisions, the Supreme Court interpreted hatred in view of the Criminal Code provisions and found that “‘hatred’ connotes emotion of an intense and extreme nature that is clearly associated with vilification and detestation.” Because the provision could be subject to a constitutional challenge, because the burden of proof in criminal law is not the same as it is in civil law, and because individuals accused of a crime are guaranteed certain rights under the Canadian Charter of Rights and Freedoms, we recommend that the bill apply the definition relied on in those decisions.
In addition, the bill makes it a hate crime to commit an offence under the Criminal Code or any other act of Parliament if the commission of the offence is motivated by hatred based on certain factors. Someone guilty of the new offence would be liable to imprisonment for life. The new provision refers to any act of Parliament, so it has a broad scope and is likely to capture a wide array of offences, without differentiating at all between the objective seriousness of each offence.
This new provision is contrary to the fundamental principle set out in section 718.1 of the Criminal Code, proportionality in sentencing. We therefore recommend enhancing the existing provisions in the Criminal Code so as not to create a new system of prosecution for hate crimes, alongside the current system.
Now, let's turn to part 3 of the bill, the amendments to the Canadian Human Rights Act. We welcome the fact that the bill restores section 13 of the act to address the communication of hate speech. The proposed new wording is more specific and better circumscribed, helping to balance the rights and freedoms protected by the charter. The Quebec bar association also agrees with the “hate speech” definition laid out in the bill, given that it respects the teachings of the Supreme Court in Whatcott, a case that centred on human rights.
Lastly, we question the punitive quality being introduced into the Canadian Human Rights Act under the bill. The Supreme Court wrote in Taylor and Blencoe that the purpose of the act is not to punish wrongdoing, but to prevent discrimination, and that the aim of a human rights system must be conciliation, not punishment. Under the bill in its current form, the act is being amended to include a punitive measure, something that would distort the purpose of a human rights system.
We recommend that the penalty instead be paid to the victim. Alternatively, if there is no identified or identifiable victim, we recommend that the penalty be paid to a human rights organization or a group targeted by the communication that constituted the discriminatory practice.
Like subsection 53(3) of the Canadian Human Rights Act, the bill could include the possibility of ordering the person responsible for the discriminatory practice to pay special compensation to the victim if the person was engaging or engaged in the discriminatory practice willfully or recklessly. We have provided additional comments in our brief.
We would now be glad to answer the committee's questions.
Thank you.
Jane:
Good afternoon. Thank you for your time today.
I am Jane. Perhaps, like I do, many of you hold the same title—the title of parent. I am the mother of a spirited young girl who was sexually abused and, on account of that abuse, has also become a victim of sexual exploitation. Maybe you, as a parent, can relate to my child's story, which unfortunately has also become our family's lived experience.
In my few allotted minutes, I'd like to provide some insight and describe a few details of the horrific sexual abuse my little girl endured and continues to endure daily. My daughter was just a toddler when, one day, fate stamped itself upon her. She was just a young child who had no choice but to entrust her life solely to the hands of an adult who was supposed to protect her from harm, teach her right from wrong and love her in a manner that cultivated, and would reinforce in the future, what a healthy relationship is supposed to look like.
In her preschool to kindergarten years, she was groomed to believe that sex or sexual actions between children and grown-ups were completely acceptable and normal. Some days, instead of watching cartoons, she spent her time with a presumably trusted adult who normalized child pornographic material. This normalization took place by subjecting her to possibly hundreds of child exploitation videos repeatedly, at any opportune moment. With the help of various child sexual abuse materials, she was raped by her abuser over and over again. Based on evidence collected by law enforcement, it could possibly be determined that she was raped and sexually assaulted on a daily basis. She was between the ages of three and six years old, and raped in such a way that she was brainwashed to believe it was a fun game. Many times, she was bribed with candy as her reward for performance. Her performance included but was not limited to oral sex and vaginal intercourse, and escalated to the insertion of various items into her anus.
The perpetrator was her biological father. This man also trafficked his own daughter by having her virtually participate in scripted, sexually explicit activities with one or more adults within the dark walls of the online world. When my child's abuser was caught, he admitted that the abuse had spiralled out of control. He had become desensitized to raping my daughter for his own sexual satisfaction. He admitted to law enforcement that he was always hungry for more.
My child's understanding of what happened to her is greater than she wishes she could remember. The extent of the damage done to her in the moment, damage that continues into the present, is incalculable. My little girl has countless flashbacks that haunt her while she sleeps. Often, she is anxious, fearful and scared. Sadly, the abuse that happened to her is now pervasively present on the dark web, where it is known to be among the top-downloaded series of child sexual abuse material in circulation. Child predators have saved and shared images and videos of her tiny body lying naked and contorted in provocative ways. Her privates are no longer private. Her vagina is on display for the world to view. Her smile, laughter and innocence have all been taken from her. How she is portrayed in those pictures and videos is not how she wants to be perceived.
Those who take it upon themselves to download, view, save and share my child's inappropriate pedophiliac merchandise take part in continually harming her. Perpetrators have blatantly premeditated their motives and have activated those actions against her will. Individuals who possess her child sexual abuse products should no doubt be held accountable and take full moral responsibility for their own contribution to the continued exploitation crisis that my child and many others continue to endure.
Because of these perpetrators' antics, my child has secluded herself from enjoyment of a fulfilling life. This is the only coping mechanism she believes she has to protect herself. She tries to hide herself by not leaving the house. If she does, she fears she may be recognized. She feels that she is damaged beyond repair. She wants the memories to vanish. Until the Internet has mandatory regulations and rules that aim to protect her and other victims against child sexual abuse material, the images will continue to exist. The evolution of technology has been her nemesis. As of now, she cannot escape the abuse, nor can the abuse escape her.
I will fight for my child and be an advocate for the protection she deserves. This should not be a debate. My child has suffered in silence for far too long. She should not be ashamed, nor should she feel guilty about the personal attacks that take place on the uncontrolled Internet. Allow her and others to find their dignity again.
Moving forward, we have a choice to be the change and shape the future of all children. My little girl is not solely a victim of hands-on offending. She is revictimized every single time her child sexual abuse material surfaces on the dark web. What kind of person doesn't want to protect the future of our children or grandchildren? I will say it again with urgency: We need a culture of lawfulness that strongly enforces Internet regulation. The unregulated Internet has damaged my child and countless children across the nation.
:
Well, thank you so much.
Ms. McNicoll, I have a question for you. This is a study of Bill C-63, and I know, from your testimony and from what I read about what your organization has said, that you're very well versed on the topic. Before I ask you a question on it, I want your opinion of Bill C-412, which I just mentioned in this motion. It's a private member's bill by a Conservative member of Parliament that deals with some of the same issues and subject matter as Bill C-63.
I'll just give a very high-level overview of it. Bill C-412 will modernize the existing crime of sexual harassment to deal with online harassment. It will require social media platforms to increase safeguards for children around bullying, sexual violence, self-harm and sexual abuse material—as witness Jane mentioned—and it will update Canada's existing laws around the distribution of non-consensual, artificially produced images—deepfakes, in other words. Bill C-63 does not address any of those topics, so there's a big gap there that we think C-412 will fill. Here's my question: Would you agree that these are important subject matters that should be discussed on a priority basis, an opportunity that is presented by Bill C-412?
Ms. Anaïs Bussières McNicoll:
Thank you for your question.
Obviously, to make informed comments on the bill, I would need to know the details. Luckily, I've had a chance to read the bill, so I am familiar with some of the issues it deals with. I agree that the types of harmful content you mentioned are problems.
Bill C-412 may give rise to fewer freedom of expression issues, but it does raise concerns around privacy and the right to equality.
I'll explain what I mean. Much of the bill refers to parental controls in relation to the content available to minors. When it comes to parental controls, it's important to understand that existing methods to verify a user's age are flawed and raise concerns related to privacy and the right to equality.
Certainly, the methods have to be effective. They can't discriminate against people on the basis of ethnicity, and the personal information collected for age verification purposes has to be handled appropriately. In other words, the collection of the data must adhere to the privacy principles that exist in Canada.
In your testimony, you referenced parts 1, 2 and 3. You're happy that parts 2 and 3 have now been removed. I know that your organization recommended that, so the minister listened to your recommendation. Congratulations.
My question is on whether part 4 of Bill C-63 could be separated out completely and dealt with separately to accelerate the protection it would afford to people who are sexually harassed. Part 4, just for your reference, amends An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service. Part 1, on the other hand, creates a regulatory body, and it will be time-consuming and expensive to get there. Part 4, if it's separated out completely, could be dealt with very quickly.
What's your opinion on that?
I want to thank all the witnesses for joining us today.
Jane—I'll refer to you that way—thank you very much for sharing your horrific story with us. We're here talking about a bill presented by the government, Bill C-63, and particularly part 1. My question for you is one that you've somewhat addressed. To quote you, “The unregulated Internet has damaged my child”, and it continues to do so on an ongoing basis.
An important part of part 1 of the bill, which is the part we're focusing on, is the so-called takedown provisions that would apply on the Internet. Criminal Code provisions are one thing, but, as you alluded to, it's important to have the ability to instantly address a problem when it arises and have something removed from the Internet as soon as possible.
Can you expand on the importance of that, in your view? Also, if this is not passed into legislation now, can you explain what impact that might have on your family and others?
Ms. Anaïs Bussières McNicoll:
I believe you're referring to the duty laid out in sections 67 and 68 and subsequent sections of the proposed act in part 1 of the bill. Those sections address specific types of very harmful content, in particular, content that sexually victimizes a child. The provisions also set out a deadline for the platform operator to assess the flagged content and remove it if the content turns out to be real.
The CCLA does not have a problem with the provisions. As we see it, the problem has more to do with the much more general duties laid out for operators. I'm talking mainly about sections 55 to 59 of the proposed act.
Section 55 sets out a general duty to take reasonable measures to prevent users from being exposed to harmful content. When I say harmful content, I'm talking about the seven types listed in the bill. Unfortunately, without adequate parameters, an operator might be tempted to take a very cautious approach in fulfilling the duty, an approach that could unreasonably limit freedom of expression in Canada.
For example, proactively searching and deleting content amounts to state surveillance by proxy. The CCLA considers that to be a problematic practice, but it isn't prohibited in the bill as it currently stands. An operator could also decide to take down content without even reviewing it, which we also consider problematic.
To be clear, we are not saying that freedom of expression is an absolute right in Canada that should not be subject to reasonable limits. However, the duties imposed on operators need to be circumscribed in a way that makes clear to operators not only what their duties are, but also that they must act reasonably to fulfill those duties in accordance with freedom of expression principles.
Thank you to the witnesses for being here. I want to say how much my heart goes out to Witness 1. I am grateful to her for sharing her story about the abuse endured by her daughter—her baby, even. The fact that this kind of thing can still happen is disturbing, and I think that we, as lawmakers, must do everything we can to prevent it from happening.
I also want to thank Ms. Claveau for being here.
I'd like to revisit two things you said when you were explaining the Barreau du Québec's position.
First, you said that the Barreau supported Quebec's call for the removal of the religious exemption set out in two provisions of section 319 of the Criminal Code. In discussions about the religious exemption, it is commonly argued that the provision has hardly been used, so there may be no point in eliminating it.
Is it possible that, even though the courts have not addressed the provision often, it is considered when a decision is being made on whether to bring a proceeding in a case?
Emotion of an intense and extreme nature is being used as an objective test.
It is important, however, to distinguish between the test set out in Keegstra and Mugesera, which were criminal law decisions, and the test set out in Whatcott and other human rights decisions. The decision was made to rework the test in Whatcott.
Basically, the test selected was the one established in the decisions I just mentioned. It was simply adjusted to clarify that the emotion must be assessed as it would reasonably be expected to arise, not in the person at the source of the content in question, but in the person on the receiving end of the content.
I think the definitions set out by the Supreme Court for the term “hatred” are very clear. It's about taking those criteria and incorporating them into the Criminal Code.
As I see it, the current provisions in Bill C-63 set a lower standard than the test established in Mugesera.
I think it's important to be very careful because when you get into freedom of expression and freedom of religion, people have rights. The Supreme Court considered the issue very seriously and thoroughly, examining hundreds of pages of material before making the findings it did and rendering its decision.
:
The message you're sending us is very clear: that we need to take action. I think all members of the committee understand that. I can't thank you enough for coming forward today to share that with us.
I have questions for the other witnesses.
[Translation]
Now I'm going to turn to Ms. Bussières McNicoll and Ms. Claveau.
Part 1 of Bill C-63 establishes fines. Operators are liable to “a fine of not more than 3% of the person’s gross global revenue or $10 million, whichever is greater”.
It says that, on summary conviction, an operator is liable to “a fine of not more than 2% of the person’s gross global revenue or $5 million, whichever is greater”.
Individuals are liable to “a fine of not more than $50,000”. That seems pretty low given the repercussions of the offence in question, such as the impact on Witness 1, her daughter and family.
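For illustration only, here is a minimal sketch of how the “whichever is greater” ceilings quoted above would operate. The percentages and dollar amounts mirror the quoted provisions; the revenue figures are hypothetical assumptions added for the example.

```python
# Illustrative sketch of the maximum-fine ceilings quoted above.
# The percentages and dollar floors mirror the quoted provisions of part 1;
# the revenue figures below are invented examples, not taken from the bill.

def max_fine_operator(gross_global_revenue: float) -> float:
    """Ceiling for operators: 3% of gross global revenue or $10 million, whichever is greater."""
    return max(0.03 * gross_global_revenue, 10_000_000)

def max_fine_summary(gross_global_revenue: float) -> float:
    """Ceiling on summary conviction: 2% of gross global revenue or $5 million, whichever is greater."""
    return max(0.02 * gross_global_revenue, 5_000_000)

# A large platform with $2 billion in revenue: 3% is $60 million, so that governs.
print(max_fine_operator(2_000_000_000))  # 60000000.0
# A smaller operator with $100 million in revenue: 3% is only $3 million, so $10 million governs.
print(max_fine_operator(100_000_000))    # 10000000.0
```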
It's one thing to put a legislative framework in place, but it's another to establish penalties in order to end the scourge. It's clear that the case involving Witness 1's daughter calls for significant penalties.
What do you think of the penalties I just mentioned and the approach outlined in the bill?
I would like Ms. Bussières McNicoll to answer first.
Ms. Anaïs Bussières McNicoll:
Thank you for your question.
I would say, at the outset, that it's important to put into context the fact that the bill establishes seven types of harmful content. When considering penalties for individuals, lawmakers mustn't go too far by unduly punishing individuals in connection with certain types of content.
As far as the penalties for operators are concerned, I will let those who wish to do so comment on the size of the fines. However, I will say that it's important to keep something in mind: the higher the penalty is, the clearer the duty needs to be. Otherwise, operators will want to fulfill the vague duties imposed on them at all costs, possibly at the expense of users' freedom of expression.
It comes back to the situation I described earlier. Taking an excessively cautious approach to flagged content, and removing it in a very swift and disproportionate way rather than properly assessing it, could be harmful to online free speech.
:
Thank you for your question.
As you no doubt saw when reading our brief, we didn't comment specifically on part 1 of the bill.
Generally speaking, though, when it comes to these types of penalties and fines, especially an administrative monetary penalty regime, a whole process goes into determining the amounts of the fines, whether they apply to individuals or businesses. In many cases, it's a percentage of the business's revenues. A lot of factors are taken into account.
I won't comment on whether the approach is consistent or appropriate, but I will say that a lot of work has to go into establishing an administrative monetary penalty regime.
I encourage you to compare this regime with others that have already been adopted to see whether there are any similarities. You can also look to the teachings of the Supreme Court in decisions relating to the validity of such regimes.
Jane, what you've done today is very courageous. People don't know this is happening. They have no idea. I believe that halfway to beating this is.... Obviously, we have to do legislation and implement change, but people don't believe that parents traffic their children. People don't believe that children are used as sexual tools online daily, as you've testified here today. They don't know because they don't want to believe that humanity is that horrific.
I want to tell you thank you. We can't fix anything if we don't acknowledge what has actually happened. Thank you for that.
There are a couple of things I want to point out. The big thing we're trying to sort out here is the best recommendation so that we have implementation as soon as possible to protect children online. We've had witness testimony on sextortion. Children are taking their lives.
Jane, you're traumatized for the rest of your life. Your child is traumatized for the rest of her life. The impact on the community is significant.
Right now, the way that Bill C-63 is written, it is calling on—and I'll use the language from it—a digital safety commission of Canada, the digital safety office of Canada, the position of a digital safety ombudsperson, and a mandate for the commission and ombudsperson to follow. This is another aspect of not having instant action.
To my Liberal colleague's point of an immediate takedown of the image, you're not going to have that with Bill C-63. You need a regulatory body to be put in place, which could take years.
What we're saying in Bill C-412 is that we would implement this instantly through the actual social media platform. A judge would have the capacity instantly to name the person who has the image, release their name and charge them. The duty of care then falls on the social media platforms to implement age verification—which we know they can do through algorithms.
The issue we're having with Bill C-63 is the same issue we've seen with other regulatory bodies. The action doesn't come with the intention.
The example I will give you is the ombudsperson we have in this country for victims. They've seen an increase of 477%. Nothing happens after the victims go to the ombudsman, right? There's no action tied to it.
My question for you, Jane, is this: Would you like to see a bill like Bill C-412 that implements instant action on the social media platforms and enables judges to ensure that those names are released so that there is actually a takedown and not just an intention of takedown?
:
Thank you very much, Mr. Chair.
Jane, I'd like to echo what my colleagues have said. The great courage you've shown to come forward is absolutely incredible, and your determination to protect not only your own daughter but also other children is commendable. Thank you.
At the last meeting, we heard from Carol Todd, who expressed concern that victims were being asked these technical legal questions, and I don't want to get into those.
However, because you talked about your experience with the police and the current process, and how there was no help available, I was wondering if you could talk about what it would mean to have a digital safety commission that could act for victims. I know some people will dismiss it as a bureaucracy, but could you speak to whether having a voice like that would be beneficial?
Ms. Anaïs Bussières McNicoll:
Thank you for the question.
First, there are indeed specific legal obligations, as suggested in part 1 of the bill. These obligations would make it possible to quickly ensure the removal of particularly harmful content. I'm thinking here of content that sexualizes children or perpetuates the victimization of survivors and intimate content shared without consent. In this sense, I consider it a significant step forward.
That said, there are still privacy issues in part 1 of this bill. As a result, one of our recommendations seeks to clarify that the obligations of operators and the obligations of the Digital Safety Commission of Canada and other regulators must respect the privacy of users and operators.
Let me explain.
Of course, we know that operators have access to users' personal information as part of their activities. We also know that certain federal legislation already regulates the collection, retention, protection and sharing of confidential and private information. The failure to specifically refer to these obligations can lead to confusion for operators.
I would like to ask Ms. Claveau or the other Barreau du Québec representatives about life imprisonment.
I gather that the Barreau du Québec considers the provision somewhat broad when it sets out this penalty for a wide range of offences. I also share this view and find it worrying.
However, if we want to convey the seriousness and gravity of the type of offence involved, is there any way to increase the penalty?
I understand that you're proposing to review sentences one by one. Couldn't we include a provision whereby, in certain set cases, the maximum or minimum penalty would be double the prescribed penalty?
Could this be a good option to look into, or do we really need to proceed offence by offence and set out specific penalties?
Dr. Emily Laidlaw (Canada Research Chair and Associate Professor of Law, University of Calgary):
Thank you for the invitation to appear before you.
My name is Emily Laidlaw. I'm a Canada research chair and associate professor of law at the University of Calgary.
At the last committee meeting, and earlier today, you heard horrific stories, bringing home the harms this legislation aims to address. With my time, I'd like to focus on the legal structure for achieving these goals: why law is needed, why part 1 of Bill C-63 is structured the way it is and what amendments are needed.
My area of expertise is technology law and human rights: specifically, platform regulation, freedom of expression and privacy. I have spent my career examining how best to write these kinds of laws. I will make three points with my time.
First, why do we need a law in the first place? When the Internet was commercialized in the 1990s, tech companies became powerful arbiters of expression. They set the rules and decided how to enforce them. Their power has only grown over time.
Social media are essentially data and advertising businesses and, now, AI businesses. How they deliver that to consumers and how they design their products and services can directly cause harm. For example, the way they design their algorithms makes decisions that affect our mental health, pushing content encouraging self-harm and hate. They use persuasive techniques to nudge addictive behaviour, such as endless scrolling rewards and constant notifications.
Thus far in Canada, we have largely relied on corporate self-governance. The EU, U.K. and U.S. passed legislation decades ago. Many are on their second-generation versions of these laws, and a network of regulators is working together to create global coherence.
Meanwhile, Canada has never passed a comprehensive law in this space. The law that does apply is piecemeal, mainly a bit of defamation, privacy and competition law, circling important dimensions of the problem, but not dealing with it directly.
Where does that leave us in Canada? Part 1 of Bill C-63 is the product of years of consultation, to which I contributed. In my view, with amendments, it is the best legal structure to address online harms.
That brings me to my second point. This legislation impacts the right to freedom of expression.
Our expert panel spent considerable time on how best to protect freedom of expression, and the graduated approach we recommended is reflected in this bill.
There are three levels to this graduated approach.
First, the greatest interference with freedom of expression is content removal, and the bill requires that for only two types of content that are the worst of the worst, the stuff that we all agree should be taken down: child sexual abuse material and the non-consensual disclosure of intimate images, both of which are crimes.
At the next level is a special duty to protect children, recognizing their unique vulnerability. The duty requires that social media integrate safety by design into their products and services.
The third, the foundation, is that social media have a duty to act responsibly. This does not require content removal. It requires that social media mitigate the risks of exposure to harmful content.
In my view, the bill aligns with global standards because it's focused on systemic risks of harm and takes a risk mitigation approach, coupled with transparency obligations.
Third, I am not here to advocate that the bill be passed as is. The bill is not perfect. It should be carefully studied and amended.
There are also other parts of the bill that don't necessarily need to be amended but entail hard choices that should be debated: the scope of the bill; what harms are included and not; what social media are included, based on size or type; the regulatory structure (a new versus existing body and what powers it should have); and what should be included in the legislation versus left to be developed later in codes of practice or regulations.
There are, however, amendments that I do think are crucial. I'll close with this list. I have three.
One, the duty to act responsibly should also include a duty to have due regard for fundamental rights in how companies mitigate risk. Otherwise, social media might implement sloppy solutions in the name of safety that disproportionately impact rights. This type of provision is in the EU and U.K. legislation.
Two, the duty to act responsibly and duty to protect children should clearly cover algorithmic accountability and transparency. I think it's loosely covered in the current bill, but it should be fleshed out and made explicit.
Three, the child protection section should be reframed around the best interests of the child. In addition, the definitions of harmful content for children should be amended in two main ways. One, content that induces a child to harm themselves should be narrowly scoped so that children exploring their identity are not accidentally captured. Two, addictive design features should be added to the list.
Thank you for your time. I look forward to our discussion.
Mr. Étienne-Alexis Boucher (President, Droits collectifs Québec):
Good evening, parliamentarians, honourable members of the House of Commons Standing Committee on Justice and Human Rights.
Thank you for this opportunity to speak as part of the pre‑study on Bill C-63, which concerns online hate speech.
My name is Étienne‑Alexis Boucher. I'm the president of Droits collectifs Québec. I was supposed to be joined by François Côté, senior legal officer at Droits collectifs Québec. Unfortunately, he can't join us on account of the brand of his microphone.
Droits collectifs Québec is a non‑profit organization governed by an independent board of directors. It identifies as an agent of social transformation and operates throughout Quebec. Our mission is to help advocate for collective rights in Quebec, particularly with regard to people's language and constitutional rights. Our approach is non‑partisan. The organization's work encompasses many areas of action, including public education, social mobilization, political representation and legal action.
I've just given a brief overview of the organization. I would now like to focus on the Quebec consensus, which covers two aspects. We've already addressed the first, and it was touched on by the witnesses in the first panel earlier, when we heard particularly poignant evidence from the mother of a young woman whose intimate images were shared.
While Ottawa refused to budge on this issue, Quebec ended up taking the lead and became a pioneer in the field. The National Assembly adopted measures in an area that would normally fall under the Criminal Code, even though Quebec has no power over the Criminal Code itself, at least for now. Using its constitutional prerogatives, Quebec adopted measures concerning the sharing of intimate content without consent. In other words, since the federal government wasn't addressing the issue, Quebec responded to the Quebec consensus with this initiative.
Another example of the Quebec consensus is the National Assembly's unanimous adoption of the request to repeal subsections 319(3)(b) and 319(3.1)(b) of the Criminal Code. These subsections state that “no person shall be convicted of an offence” of wilfully promoting hatred against an identifiable group “if, in good faith, the person expressed or attempted to establish by an argument an opinion on a religious subject or an opinion based on a belief in a religious text.”
This exception in the name of religious freedom has no place in a modern state such as Canada. We know that the Constitution of 1867 states that power in Canada is granted by divine right; even the head of state is chosen not democratically by the citizens of Canada, but by God. However, it's now the 21st century. I don't think that freedom of religion should rank higher than freedom of conscience, for example, or freedom of political opinion, when everyone acknowledges that certain limits are valid. For example, teachers may not, in the course of their duties, express opinions on the political status of Quebec or Canada. Such limits to a basic freedom are perfectly justifiable.
However, we find it completely unacceptable to make something normally considered a crime into a non‑crime in the name of freedom of religion. As a result, we're ultimately encouraging the parliamentarians to heed the call of Quebec's justice minister. Once again, the vast majority of Quebeckers are in agreement. The justice minister expressed a widely‑held consensus that hate speech based on religion is simply unacceptable.
There have been some concrete examples, and we've seen the abuses and effects resulting from this exception up until now. In a fully public manner, in front of hundreds of thousands of individuals, if we count the people who viewed the images widely available on social media, a call to genocide was made in the name of a religion.
Unfortunately, this call could not be criminally prosecuted, probably because of the exception. Again, we think this is unacceptable. This position is held by the Quebec government and by organizations such as the Rassemblement pour la laïcité, of which I am the vice-president. Ours is an umbrella organization for dozens of organizations representing thousands of people.
Mr. Matt Hatfield (Executive Director, OpenMedia):
Good evening. I'm Matt Hatfield, the executive director of OpenMedia, a non-partisan, grassroots community of over 250,000 people in Canada working for an open, affordable and surveillance-free Internet.
I'm joining you from the unceded territory of the Stó:lō, Tsleil-Waututh, Squamish and Musqueam nations.
It's a pretty remarkable thing to be here today to talk about the online harms bill. When Canadians first saw what this bill might look like as a white paper back in 2021, we didn't much like what we saw. OpenMedia called it a blueprint for making Canada's Internet one of the most censored and surveilled in the democratic world, and we were far from alone in being concerned.
For once, our government listened. The rush to legislate stopped. National consultations were organized across the country on how to get regulation right with a wide range of stakeholders and experts on harms and speech. The resulting part 1 of Bill is an enormous, night-and-day improvement. Simple-minded punitive approaches that would have done more harm than good are gone, and nuances and distinctions made throughout show real sophistication about how the Internet works and how different harms should be managed. Packaging part 1—the online harms act itself—with changes to the Criminal Code and Human Rights Act proposed alongside it badly obscured that good work. That's why, alongside our peers, we called for these parts to be separated and why we warmly welcome the government's decision to separate those parts out.
I'll focus here on part 1 and part 4.
OpenMedia has said for years that Canadians do not have to sacrifice our fundamental freedoms to make very meaningful improvements to our online safety. The refocused Bill C-63 is the proof. Instead of trying to solve everything unpleasant on the Internet at once, Bill C-63 focuses on seven types of already-illegal content in Canada, and treats the worst and most easily identifiable content—child abuse material and adult material shared without consent—most severely. That's the right call. Instead of criminalizing platforms for the ugly actions of a small number of users, which would predictably make them wildly overcorrect to surveil and censor all of us, Bill C-63 asks them to write their own assessments of the risks posed by these seven types of content and document how they try to mitigate that risk. That's the right call again. It will put the vast engineering talent of platforms to work for the Canadian public, thinking creatively about ways to reduce these specific illegal harms. It will also make them explain what they are doing as they do it, so we can assess whether it makes sense and correct it if it does not.
However, I want to be very clear: It is not the time to pass Bill C-63 and call it quits. It's just the opposite. Because the parts that are now being separated raise so many concerns, there has not been nearly enough attention paid to refining part 1. I know you'll be hearing from a range of legal and policy experts about concerns they have with some of the part 1 wording and recommended fixes. I hope you will listen very carefully to all of them and adopt many of the fixes they suggest to you.
This is not the time to be a rubber stamp. The new digital safety commission is granted extraordinary power to review, guide and make binding decisions on how platforms moderate the public expression of Canadians in the online spaces we use the most. That's appropriate if, and only if, you make sure they carefully consider and minimize impacts on our freedom of expression and privacy. It isn't good enough for the commission to think about our rights only in its explicit decisions. A badly designed platform safety plan could reduce an online harm but have a wildly disproportionate impact on our privacy or freedom of expression. You need to make sure platforms and the regulator make written assessments of the impact of their plans on our rights and ensure that any impact is small and proportionate to the harm mitigated. Bill C-63's protections of private, encrypted communication, and against platforms surveilling their users, need to be strengthened further and made airtight.
OpenMedia has a unique role in this discussion because we are both a rights-defending community that will always stand up for our fundamental freedoms and a community of consumer advocates who fight for common-sense regulation that empowers us and improves our daily lives. If you do your work at this committee, you can make Bill C-63 a win on both these counts. Since 2021, members of our community have sent nearly 22,000 messages to government asking you to get online harms right. Taking your time to study Bill C-63 carefully and make appropriate fixes before passing it would fulfill years of our activism and make our Internet a better, healthier place for many years to come.
Thank you, and I look forward to your questions.
[English]
My first question is for Mr. Hatfield.
Thank you for your presentation.
I'm curious, given the very clear concerns you've expressed relating to parts 2 and 3 of Bill C-63, why you're not more concerned about some sections of part 1, particularly those related to the digital safety commission, the digital safety office and the digital safety ombudsperson, which would lay some of the bureaucratic groundwork that makes parts 2 and 3 possible.
Are you concerned about those sections of part 1? Would you care to give us some specific concerns you have related to part 1, which we're focused on today?
Dr. Emily Laidlaw:
Yes, thanks so much for that opportunity.
What I think is critical—and this builds on what Matt was just talking about—is that there is always a risk of overcorrection if the focus is purely on harms. That's why it's important to recognize that one of the key harms can be to freedom of expression, and to privacy in particular. It's important for the companies to file digital safety plans that explain how they make decisions bespoke to their services, decisions that balance the scope of harms while thinking through a way of doing things that is most protective of privacy and freedom of expression. The digital safety commission would have a duty to consider that in what it does, but the obligation needs to rest on the company as well.
I think, concerning the child protection measures, that the best interest of the child is protected under international law. I think that is the blueprint here. Detailing specifically what it is about child protection that we're looking for when we talk about safety by design is incredibly important.
Of course, there's algorithmic accountability. I can discuss this further with you, but I'm conscious of time.
It's great to see you in the chair, Mr. Chair.
I would like to say hello to Étienne-Alexis Boucher, who hails from my region.
I'm going to direct my questions to Ms. Laidlaw and Mr. Hatfield.
Ms. Laidlaw, it was said earlier that the bill was the result of several years of consultation. You've been part of this process.
On what basis was the list of seven categories of harmful content drawn up?
The bill sets out an obligation for platforms to act responsibly. What does that mean in concrete terms? I imagine it implies the obligation to identify the risk of harm and mitigate its effects.
I'd like to hear your comments on that.
From my many discussions with other governments, I can say that the duty to act responsibly is generally the same as the duty of care in the U.K. or the due diligence and risk management obligations in Europe more broadly. It's all about a due diligence approach by companies at a systemic level.
The duty to act responsibly came out of the Commission on Democratic Expression, and it's really practical. Their recommendation was that we shouldn't use the language “duty of care”, which is the language of tort law and might be confusing if this goes to court. We want this to be a stand-alone statutory duty that is simply set out in the legislation, and “duty to act responsibly” captured that better.
When it comes to the harms that are included, I think that is a point for debate. In discussions with colleagues, one that could be added to the list is the crime of identity fraud. That is a major issue, and I think it would be appropriate to include that.
It's notable what's not on the list. A point I have had multiple discussions about is the inclusion of mis- and disinformation, which generally fall into the category of “lawful but awful”. That is included in the EU legislation. The decision was not to include it in Canada because of what I would take as the problematic risks to freedom of expression. That is, we would be biting off more than should be taken on by a regulator.
One last point is that we have to think in terms of what a regulator can practically take on. We know this will cost money, so some of this is a discussion about which high-risk issues we should practically include, the issues a regulator can investigate and make a difference on now.
I'll leave it there.
Thank you.
I will now take the floor for six minutes.
I'd like to thank all the witnesses for being with us today.
Mr. Boucher, Mr. Côté, Mr. Hatfield and Ms. Laidlaw, your participation is invaluable.
Ms. Laidlaw, on the subject of hate, Bill C-63 provides that “A person may, with the Attorney General's consent, lay an information before a provincial court judge if the person fears on reasonable grounds that another person will commit...”.
Are you not concerned that this wording is a little too vague and that it could lead to abuse?
:
Unfortunately, neither of you is able to testify for this reason.
In my opinion, this was a mistake on the part of the House of Commons. From what I understand, headsets were sent to you last spring, but you did not receive the new ones.
Please know that if you would like to come back to this committee, we can invite you at a later date.
That said, if you wish, you can simply send us your comments or answers to the questions put to you in writing. Those answers must be sent to all committee members without delay, and they will be taken into consideration.
Again, if you wish to be invited back, I will personally ask that we invite you at a future meeting. At that time, we'll send you the right headsets.
I apologize on behalf of our committee. If you are able to send us your observations and comments in writing, I would appreciate it.
Please know that we take this issue very seriously. It's important to know that interpreters have suffered damage to their hearing. I didn't know it could happen. I learned that during the pandemic. Since then, we've been using videoconferencing a lot, and we need to protect our interpreters, whom we really need to do our job.
Do let me know if you want to testify again. I will make sure that you are invited at a later date.
Thank you.
I had one minute of speaking time left, and I'm going to allow myself to ask my question right away.
Ms. Laidlaw, when it comes to maximum sentences, we're talking about life sentences for certain aggravated offences related to hate speech. For example, someone who committed a hate-related offence could be subject to a life sentence.
In your opinion, is this the right sentence here? Should we review that provision?
:
Thank you very much, Mr. Chair.
I completely agree with what you said about the interpreters. You made an important and fair call. The interpreters must be properly protected at all times. They really are one of the foundations of our Parliament. It's a shame to have to interrupt a witness's testimony, but you did the right thing and acted responsibly.
[English]
I want to come to Ms. Laidlaw.
Thank you very much for your presentation today. You talked about algorithm issues as well. I want to come back to you on that issue.
We have before Parliament, as you know, Bill , which is an act respecting transparency for online algorithms. To what extent do you think—in addition to part 1 of the bill—we need to look at legislation or perhaps need to incorporate portions of that bill to ensure that we actually have a situation where Canadians can see the transparency around algorithms, and in that way, as well, we can ensure greater safety?