
Standing Committee on Justice and Human Rights


NUMBER 126 | 1st SESSION | 44th PARLIAMENT

EVIDENCE

Monday, December 9, 2024

[Recorded by Electronic Apparatus]

(1600)

[Translation]

    I call the meeting to order.

[English]

     Welcome to meeting 126 of the House of Commons Standing Committee on Justice and Human Rights.
    At this point in the meeting, I'd like to propose the adoption of the Bill C-63 prestudy budget in the amount of $23,250. I understand that budget was previously distributed to all members.
    Can I see a show of hands in support?
     (Motion agreed to)
    The Vice-Chair (Mr. Larry Brock): That's unanimous. It's adopted.
    Thank you.
    Pursuant to Standing Order 108(2) and the motion adopted on December 2, 2024, the committee is meeting in public to begin its study on the subject matter of Bill C-63, an act to enact the online harms act, to amend the Criminal Code, the Canadian Human Rights Act, and an act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service and to make consequential and related amendments to other acts.
    At this juncture, I'd like to welcome our witnesses for the first hour.
    Appearing by video conference, on behalf of the Quebec Bar, we have Catherine Claveau, president; Nicolas Le Grand Alary, secretariat of the order and legal affairs; and Michel Marchand, member, criminal law expert group.
    Appearing in person is Madame Anaïs Bussières McNicoll, director, fundamental freedoms program, Canadian Civil Liberties Association.
    Witnesses and members, please wait until I recognize you by name before speaking. For those participating by video conference, please ensure that you have selected the language of your choice for simultaneous interpretation, which is on the bottom left of your screen, and please mute yourself when you are not speaking.
     I remind all members to take the floor only after being recognized by the chair.
    Without any further delay, the floor is yours. Each witness has five minutes.
    Who would like to start?
    Perhaps Madame Bussières McNicoll can start.
(1605)

[Translation]

    The Canadian Civil Liberties Association, or CCLA, appreciates the opportunity to share its view on Bill C‑63.
    The CCLA is an independent, national non-governmental organization founded in 1964 with a mandate to defend and foster the civil liberties, human rights and democratic freedoms of all people across Canada. We work to achieve strong protections for freedom of expression, privacy and principles of fundamental justice. That work is central to our mandate.
    The CCLA recognizes the importance of legislative measures to protect some of the most vulnerable members of society from especially harmful forms of online speech. In that sense, the CCLA recognizes that some of the duties established under part 1 of the bill for operators of a regulated service are welcome. However, the current iteration of the online harms act also sets out broader duties that need to be clarified and limited appropriately. Otherwise, they will give rise to problems in relation to freedom of expression.
    For example, the general duty set out in subsection 55(1) of the proposed act requires operators to implement measures that are adequate to mitigate the risk that users of the service will be exposed to harmful content on the service. The scope of the provision is too vague. In the absence of proper parameters, operators will likely try to fulfill the unspecific duty as efficiently and economically as possible, potentially at the expense of users' freedom of expression. For instance, operators could proactively monitor content, which at this point is not prohibited under the new act, or they could take down content as determined by non-transparent algorithms.
    The general duty imposed on operators to implement tools and processes to flag harmful content, as per section 59 of the proposed act, has similar flaws, which would likely jeopardize freedom of expression as well. As it is written, the online harms act would allow operators to remove various types of flagged content, without giving the user who posted the content an opportunity to present their view. In fact, as written, the proposed act would even implicitly allow operators to remove various types of flagged content without first having to determine whether the content was indeed harmful.
    The first three recommendations in our written submission to the committee address these concerns. We recommend that operators, in their efforts to fulfill their statutory duties, be prohibited from engaging in mass surveillance and unduly limiting users' freedom of expression. We also recommend that the newly created body in the bill, the digital safety commission of Canada, be required to check annually that operators are fulfilling their duties as they relate to users' rights.
    The CCLA applauds the justice minister's recently announced plan to remove parts 2 and 3 from this bill. This addresses a joint request made months ago by the CCLA and a number of civil society groups to ensure that the committee's study of part 1 was not overshadowed by controversial changes to the Criminal Code and the Canadian Human Rights Act. The CCLA is of the view that Parliament should not pass parts 2 and 3 of the bill.
    With respect to the proposed Criminal Code amendments, the new hate-motivated offence would irrationally increase the maximum sentence associated with any offence in Canada to life imprisonment. This excessive judicial discretion paves the way for disproportionate sentencing and an increase in plea bargaining by innocent and vulnerable defendants. It would also hinder free speech in Canada.
    The CCLA also objects to the new “fear of hate propaganda offence or hate crime” provision. Criminal law should be a means of holding individuals accountable for what they have done, not for what others fear they might do. Allowing a judge to limit the freedom and expression of an individual who is not even suspected or accused of having committed a crime, let alone convicted of one, unreasonably and unjustifiably infringes on several rights protected by the Canadian Charter of Rights and Freedoms.
    Lastly, I will turn to part 3 of the bill, the amendments being proposed to the Canadian Human Rights Act. The CCLA is of the view that the proposed amendments are neither an appropriate nor effective way to address the problem of hate speech in our modern society. The amendments would result in an onslaught of complaints to human rights organizations, which are already chronically under-resourced.
    Thank you.
    I would be pleased to answer your questions.
    Thank you very much.
    You may go ahead with your opening remarks, Ms. Claveau. You have five minutes.
    Good afternoon, members of the committee.
    My name is Catherine Claveau, and I am the president of the Barreau du Québec. Joining me from the Barreau du Québec are Michel Marchand, member of the criminal law expert group; and Nicolas Le Grand Alary, lawyer, secretariat of the order and legal affairs. Thank you for giving the Quebec bar association the opportunity to comment on Bill C‑63.
    Given our experience in criminal law and human rights, our remarks will focus solely on parts 2 and 3 of the bill, the proposed amendments to the Criminal Code and the Canadian Human Rights Act.
    Let's start with part 2, the Criminal Code amendments. With the significant rise in hate crimes, most of which are based on race and ethnic origin, it is paramount that the bill provide the courts with the tools to respond effectively, while ensuring they adhere to the principles of fundamental justice and Canada's constitutional requirements. That is why the Barreau du Québec supports the Quebec justice minister's call for lawmakers to remove the religious exemption in the Criminal Code for hate propaganda.
    The Quebec bar association considers it essential to codify a definition of hate. On one hand, this would encourage people to report incidents while helping communities clearly understand what is prohibited. On the other, it would give all actors in the justice system, police, in particular, a clear framework within which to operate.
    However, we have concerns about the definition being proposed in the bill for the term “hatred”, which is based on the decision in Whatcott. In that case, the Supreme Court of Canada ruled on the constitutionality of a human rights provision prohibiting hate publications. The Quebec bar association considers the key decision in criminal matters to be the 1990 decision in Keegstra. The Supreme Court relied on the analysis in Keegstra in Mugesera in 2005.
    In both decisions, the Supreme Court interpreted hatred in view of the Criminal Code provisions and found that “‘hatred’ connotes emotion of an intense and extreme nature that is clearly associated with vilification and detestation.” The provision could be subject to a constitutional challenge, and since the burden of proof in criminal law is not the same as it is in civil law and since individuals accused of a crime are guaranteed certain rights under the Canadian Charter of Rights and Freedoms, we recommend that the bill apply the definition relied on in those decisions.
    In addition, the bill makes it a hate crime to commit an offence under the Criminal Code or any other act of Parliament if the commission of the offence is motivated by hatred based on certain factors. Someone guilty of the new offence would be liable to imprisonment for life. The new provision refers to any act of Parliament, so it has a broad scope and is likely to capture a wide array of offences, without differentiating at all between the objective seriousness of each offence.
    This new provision is contrary to the fundamental principle set out in section 718.1 of the Criminal Code, proportionality in sentencing. We therefore recommend enhancing the existing provisions in the Criminal Code so as not to create a new system of prosecution for hate crimes, alongside the current system.
    Now, let's turn to part 3 of the bill, the amendments to the Canadian Human Rights Act. We welcome the fact that the bill restores section 13 of the act to address the communication of hate speech. The proposed new wording is more specific and better circumscribed, helping to balance the rights and freedoms protected by the charter. The Quebec bar association also agrees with the “hate speech” definition laid out in the bill, given that it respects the teachings of the Supreme Court in Whatcott, a case that centred on human rights.
    Lastly, we question the punitive quality being introduced into the Canadian Human Rights Act under the bill. The Supreme Court wrote in Taylor and Blencoe that the purpose of the act is not to punish wrongdoing, but to prevent discrimination, and that the aim of a human rights system must be conciliation, not punishment. Under the bill in its current form, the act is being amended to include a punitive measure, something that would distort the purpose of a human rights system.
(1610)
    We recommend that the penalty instead be paid to the victim. Alternatively, if there is no identified or identifiable victim, we recommend that the penalty be paid to a human rights organization or a group targeted by the communication that constituted the discriminatory practice.
    Like subsection 53(3) of the Canadian Human Rights Act, the bill could include the possibility of ordering the person responsible for the discriminatory practice to pay special compensation to the victim if the person was engaging or engaged in the discriminatory practice willfully or recklessly. We have provided additional comments in our brief.
    We would now be glad to answer the committee's questions.
    Thank you.
(1615)
    Thank you, Ms. Claveau.

[English]

     Next we'll go to Monsieur Le Grand Alary for five minutes.

[Translation]

    Mr. Le Grand Alary is also from the Barreau du Québec, so he doesn't have any opening remarks.
    In that case, we will now hear from Mr. Marchand for five minutes.
    Mr. Marchand is also part of the Barreau du Québec team.

[English]

    We'll go to Witness Number One for five minutes, please.
     Good afternoon. Thank you for your time today.
     I am Jane. Perhaps, like I do, many of you hold the same title—the title of parent. I am the mother of a one-spirited young girl who was sexually abused and, on account of that abuse, has also become a victim of sexual exploitation. Maybe you, as a parent, can relate to my child's story, which unfortunately has also become our family's lived experience.
     In my few allotted minutes, I'd like to provide some insight and describe a few details of the horrific sexual abuse my little girl endured and continues to endure daily. My daughter was just a toddler when, one day, fate stamped itself upon her. She was just a young child who had no choice but to solely entrust her life to the hands of an adult who was supposed to protect her from harm, teach her right from wrong and love her in such a manner that cultivated and would enforce, in the future, what a healthy relationship is supposed to look like.
     In her preschool to kindergarten years, she was groomed to believe that sex or sexual actions between children and grown-ups were completely acceptable and normal. Some days, instead of watching cartoons, she spent her time with a presumably trusted adult who normalized child pornographic material. This normalization took place by subjecting her to possibly hundreds of child exploitation videos repeatedly, at any opportune moment. With the help of various child sexual abuse materials, she was conveniently raped by her abuser over and over again. Based on evidence collected by law enforcement, it could possibly be determined that she was raped and sexually assaulted on a daily basis. She was between the ages of three and six years old, and raped in such a way that she was brainwashed to believe it was a fun game. Many times, she was bribed with candy as her reward for performance. Her performance included but was not limited to oral sex, vaginal intercourse and the insertion of various items into her anus.
    The perpetrator was her biological father. This man also trafficked his own daughter by having her virtually participate in scripted, sexually explicit activities with one or more adults within the dark walls of the online world. When my child's abuser was caught, he admitted that the abuse had spiralled out of control. He had become desensitized to raping my daughter for his own sexual satisfaction. He admitted to law enforcement that he was always hungry for more.
     My child's understanding of what happened to her is greater than she wishes she could remember. The extent of the damage done to her in the moment, and that continues into the present, is incalculable. My little girl has countless flashbacks that haunt her while she sleeps. Often, she is anxious, fearful and scared. Sadly, the abuse that happened to her is now hyperactively present on the dark web, known to be one of the top-downloaded series of child sexual abuse material circulating on the dark web. Child predators have saved and shared images and videos of her tiny body lying naked and contorted in provocative ways. Her privates are no longer private. Her vagina is on display for the world to view. Her smile, laughter and innocence have all been taken from her. How she is portrayed in those pictures and videos is not how she wants to be perceived.
    Those who take it upon themselves to download, view, save and share my child's inappropriate pedophiliac merchandise take part in continually harming her. Perpetrators have blatantly premeditated their motives and have activated those actions against her will. Individuals who possess her child sexual abuse products should no doubt be held accountable and take full moral responsibility for their own contribution to the continued exploitation crisis that my child and many others continue to endure.
    Because of these perpetrators' antics, my child has secluded herself from enjoyment of a fulfilling life. This is the only coping mechanism she believes she has to protect herself. She tries to hide herself by not leaving the house. If she does, she fears she may be recognized. She feels that she is damaged beyond repair. She wants the memories to vanish. Until the Internet has mandatory regulations and rules that aim to protect her and other victims against child sexual abuse material, the images will continue to exist. The evolution of technology has been her nemesis. As of now, she cannot escape the abuse, nor can the abuse escape her.
     I will fight for my child and be an advocate for the protection she deserves. This should not be a debate. My child has suffered in silence for far too long. She should not be ashamed, nor should she feel guilty about the personal attacks that take place on the uncontrolled Internet. Allow her and others to find their dignity again.
(1620)
    Moving forward, we have a choice to be the change and shape the future of all children. My little girl is not solely a victim of hands-on offending. She is revictimized every single time her child sexual abuse material surfaces on the dark web. What kind of person doesn't want to protect the future of our children or grandchildren? I will say it again with urgency: We need a culture of lawfulness that strongly enforces Internet regulation. The unregulated Internet has damaged my child and countless children across the nation.
     Thank you, Witness Number One.
    I now recognize Mr. Van Popta. Sir, you have six minutes.
    Thank you, Mr. Chairman.
    Thank you to all the witnesses.
    Jane, thank you for your very courageous statement.
    Before I get into questions for the witnesses, I would like to read a motion into the record, a notice of motion, which reads:
Given that:

a need exists to quickly pass provisions to keep Canadians safe while protecting free speech;

members of several political parties have expressed a need to split Bill C-63 in part due to reservations about the bill's provisions involving restrictions on speech;

the committee is currently doing a pre-study on Bill C-63;

Bill C-63 proposes giving full responsibility to develop regulations for online platforms to a regulator that hasn't been formed yet, and does so with too much ambiguity on what regulations this body will propose or administer;

based on witness testimony, a better approach to keep Canadians safe online while protecting their civil liberties would be to legislate a defined list of responsibilities that online platforms have to undertake to keep Canadians safe;

That the committee proceed to concurrently pre-study Bill C-412, promotion of safety in the digital age act, along with its pre-study of Bill C-63, so as to examine other legislative options to protect Canadians online, which could be quickly advanced by consensus without the controversial elements of C-63.
    That's the motion, Mr. Chair.
    Mr. Van Popta, you are only giving notice at this point, you're not moving it. Is that correct?
    I'm just giving notice of it; that's right.
     You have about four and a half minutes left.
    Well, thank you so much.
    Ms. McNicoll, I have a question for you. This is a study of Bill C-63, and I know, from your testimony and from what I read about what your organization has said, that you're very well versed on the topic. Before I ask you a question on it, I want your opinion of Bill C-412, which I just mentioned in this motion. It's a private member's bill by a Conservative member of Parliament, our colleague Michelle Rempel Garner, that deals with some of the same issues and subject matter as Bill C-63.
    I'll just give a very high-level overview of it. Bill C-412 will modernize the existing crime of sexual harassment to deal with online harassment. It will require social media platforms to increase safeguards for children around bullying, sexual violence, self-harm, and sexual abuse material—as witness Jane mentioned—and it will update Canada's existing laws around distribution of non-consensual, artificially produced images—deepfakes, in other words. Bill C-63 does not address any of those topics, so there's a big gap there that we think C-412 will fill. Here's my question: Would you agree that these are important subject matters that should be discussed on a priority basis, an opportunity that is presented by Bill C-412?

[Translation]

    Thank you for your question.
    Obviously, to make informed comments on the bill, I would need to know the details. Luckily, I've had a chance to read the bill, so I am familiar with some of the issues it deals with. I agree that the types of harmful content you mentioned are problems.
    Bill C‑412 may give rise to fewer freedom of expression issues, but it does raise concerns around privacy and the right to equality.
    I'll explain what I mean. Much of the bill refers to parental controls in relation to the content available to minors. When it comes to parental controls, it's important to understand that existing methods to verify a user's age are flawed, and raise concerns related to privacy and the right to equality.
    Certainly, the methods have to be effective. They can't discriminate against people on the basis of ethnicity, and the personal information collected for age verification purposes has to be handled appropriately. In other words, the collection of the data must adhere to the privacy principles that exist in Canada.
(1625)

[English]

     Understood.
    I'm happy that you have some familiarity with Bill C-412. In your opinion, could Bill C-412 and Bill C-63 be studied at the same time?

[Translation]

    That is outside my area of expertise.

[English]

     Fair enough.
    In your testimony, you referenced parts 1, 2 and 3. You're happy that parts 2 and 3 have now been removed. I know that your organization recommended that, so the minister listened to your recommendation. Congratulations.
    My question is on whether part 4 of Bill C-63 could be separated out completely and dealt with separately to accelerate the protection it would afford to people who are sexually harassed. Part 4, just for your reference, amends An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service. Part 1, on the other hand, creates a regulatory body. It will be time-consuming and expensive to get there. Part 4, if it's separated out completely, could be dealt with very quickly.
    What's your opinion on that?

[Translation]

    I would say that part 4 is certainly less problematic.
    However, part 1 is very important. I know that civil society groups spent a lot of time examining that part of the bill and that many of them are prepared to take a position on it, including ours.

[English]

     You have nine seconds left.
    I can't do anything good in nine seconds.
     Thank you, Madam McNicoll, for your testimony.
     Thank you, Mr. Van Popta.
    Mr. Mendicino, you have six minutes, sir.
    I think it's Ms. Brière.

[Translation]

    We now go to Mrs. Brière for six minutes.

[English]

    Thank you, Mr. Chair.

[Translation]

    Thank you for your remarks, Witness 1.

[English]

     I will have to stop you there, Madame Brière. There's no interpretation coming through. There are some issues with your headset.

[Translation]

    All right, but the sound checks were done, and everything was working.

[English]

     Can we move to somebody else? We don't have time to suspend.
    Will that be you, James?
     Madame Brière, we won't suspend. We'll have the technicians give you a call to sort out your headphone issues.
    In the meantime, I will turn matters over to Mr. Maloney.
    Mr. Maloney, you have roughly five and a half minutes left.
    Thank you, Mr. Chair.
    I want to thank all the witnesses for joining us today.
    Jane—I'll refer to you that way—thank you very much for sharing your horrific story with us. We're here talking about a bill presented by the government, Bill C-63, and particularly part 1. My question for you is one that you've somewhat addressed. To quote you, “The unregulated Internet has damaged my child”, and it continues to do so on an ongoing basis.
    An important part of part 1 of the bill, which is the part we're focusing on, is the so-called takedown provisions that would be required on the Internet. Criminal Code provisions are one thing, but there's a requirement, as you alluded to, about the importance of having the ability to instantly address a problem when it arises and have something removed from the Internet ASAP.
    Can you expand on the importance of that, in your view? Also, if this is not passed into legislation now, can you explain what impact that might have on your family and others?
(1630)
     Implementing the immediate takedown is definitely the biggest part of solving this problem. Without having that in place, there's no end to this.
    Thank you.
    Having lived through this—I take it that included dealing with the police and the authorities—was there any immediate action taken to have the content removed from the online world? If so, please share your experience in that regard.
     My experience is that this has been a very slow and drawn-out process and horrifically painful.
     It's been completely ineffective.
     What do you say to those, then, who are openly critical of these provisions of the bill and who have stated unequivocally that should this bill be passed, they would remove those parts of the legislation and in fact remove the bill altogether?
    This part of the bill is one of the most important parts: Protecting the children. This is the beginning of the end of it.
     Thank you.
     I'm going to move on to you, Ms. McNicoll, because in your opening remarks and in your answers to some of these questions, you spoke about parts 2 and 3, which we're not talking about right now.
    You did say that part 1 is important and you're willing to take a position. What is your position on part 1, particularly in light of what you just heard from Jane?

[Translation]

    I believe you're referring to the duty laid out in sections 67 and 68 and subsequent sections of the proposed act in part 1 of the bill. Those sections address specific types of very harmful content, in particular, content that sexually victimizes a child. The provisions also set out a deadline for the platform operator to assess the flagged content and remove it if the content turns out to be real.
    The CCLA does not have a problem with the provisions. As we see it, the problem has more to do with the much more general duties laid out for operators. I'm talking mainly about sections 55 to 59 of the proposed act.
    Section 55 sets out a general duty to take reasonable measures to prevent users from being exposed to harmful content. When I say harmful content, I'm talking about the seven types listed in the bill. Unfortunately, without adequate parameters, an operator might be tempted to take a very cautious approach in fulfilling the duty, an approach that could unreasonably limit freedom of expression in Canada.
    For example, proactively searching and deleting content amounts to state surveillance by proxy. The CCLA considers that to be a problematic practice, but it isn't prohibited in the bill as it currently stands. An operator could also decide to take down content without even reviewing it, which we also consider problematic.
    Frankly, we are not saying that freedom of expression is an absolute right in Canada that should not be subject to reasonable limits. However, the duties imposed on operators need to be circumscribed in a way that makes clear to operators not only what their duties are, but also the fact they must act reasonably to fulfill those duties in accordance with freedom of expression principles.

[English]

     Thank you.
    Just considering Jane's evidence, it's better to have a takedown provision in the legislation than to leave it to the criminal law, and certainly better than leaving it to the Internet providers themselves. Wouldn't you agree with that?

[Translation]

    I'm not sure whether it is better to have both mechanisms, but I think it's an interesting idea. Again, I, personally, have no issues with the takedown provisions in relation to the highly disturbing and harmful content Witness 1 described. I'm at a loss for words to convey how sympathetic I am to the situation the witness bravely shared.

[English]

     Thank you.
    That is your time, Mr. Maloney.
    Thank you, Mr. Chair.

[Translation]

    You have six minutes, Mr. Fortin. Go ahead.
    Thank you, Mr. Chair.
    Thank you to the witnesses for being here. I want to say how much my heart goes out to Witness 1. I am grateful to her for sharing her story about the abuse endured by her daughter—her baby, even. The fact that this kind of thing can still happen is disturbing, and I think that we, as lawmakers, must do everything we can to prevent it from happening.
    I also want to thank Ms. Claveau for being here.
    I'd like to revisit two things you said when you were explaining the Barreau du Québec's position.
    First, you said that the Barreau supported Quebec's call for the removal of the religious exemption set out in two provisions of section 319 of the Criminal Code. In discussions about the religious exemption, it is commonly argued that the provision has hardly been used, so there may be no point in eliminating it.
    Is it possible that, even though the courts have not addressed the provision often, it is considered when a decision is being made on whether to bring a proceeding in a case?
(1635)
    That would be our assumption, as well.
    I think it is also important to recognize that hate propaganda and hate speech are unfortunately on the rise, and that all such speech based on religion will probably increase as well.
    In our view, one of the ways to address that scourge is to get rid of the religious exemption.
    The other thing I wanted to discuss with you was the definition of the word “hatred”. It is very difficult, impossible even, to clearly define the concept in a manner suitable to everyone and all situations. It is always a sensitive subject.
    The Supreme Court's teachings on the matter are obviously valuable. The definition of hatred you recommend is the one in Keegstra, which states that the term “connotes emotion of an intense and extreme nature that is clearly associated with vilification and detestation.”
    Certainly, “vilification” and “detestation” are perhaps easier to define. Judges would eventually have to determine whether an accused was driven by emotion of an intense and extreme nature—not any old emotion. I don't want to make things up, but it's likely that the decisions reached by those judges will be shockingly conflicting. I actually don't have another definition to recommend. I defer to the wisdom of the Supreme Court justices.
    In your view, how careful should we be in trying to define something as personal and subjective as emotion of an intense and extreme nature driving a person's behaviour?
    Thank you for your question.
    I'm going to ask my colleagues to answer. They are more familiar than I am with interpreting Supreme Court decisions.
    I think the member is right about that point.
    On the whole, the Barreau submits that it is difficult to identify a definition of the word “hatred”. It is important to ensure that the criteria referred to in the Supreme Court's decisions are met. That is why the definition chosen must leave as little room as possible for court challenges or conflicting decisions, as you mentioned.
    I will ask Mr. Marchand to elaborate on definitions and the more technical considerations.
    Emotion of an intense and extreme nature is being used as an objective test.
    It is important, however, to distinguish between the test set out in Keegstra and Mugesera, which were criminal law decisions, and the test set out in Whatcott and other human rights decisions. The decision was made to rework the test in Whatcott.
    Basically, the test selected was the one established in the decisions I just mentioned. It was simply adjusted to clarify that the emotion must be characterized as would reasonably be expected. That means the emotion, not of the person at the source of the content in question, but of the person on the receiving end of the content.
    I think the definitions set out by the Supreme Court for the term “hatred” are very clear. It's about taking those criteria and incorporating them into the Criminal Code.
    As I see it, the current provisions in Bill C‑63 set a lower standard than the test established in Mugesera.
    I think it's important to be very careful because when you get into freedom of expression and freedom of religion, people have rights. The Supreme Court considered the issue very seriously and thoroughly, examining hundreds of pages of material before making the findings it did and rendering its decision.
(1640)

[English]

     Thank you, Mr. Marchand.
     That's your time, Monsieur Fortin.
    Mr. Julian, you have six minutes.
     Thank you, Mr. Chair.
    I want to say to you, Jane, that I've been in Parliament for many years, and this is one of the most moving presentations I've heard by a witness. I know it would have been extremely difficult for you to come forward to this committee. We can't thank you enough for your brutal honesty on what your daughter has been through. It is something that I think will remain in our minds for some time to come. Thank you for sharing that. All of us hope that your daughter is getting the care and supports she needs.
    The fact that these images are continuing to circulate obviously shows the importance of moving forward as quickly as possible with the provisions of the bill in part 1 that deal with criminal sexual exploitation of children.
    At this point, is it individuals, companies...? Who is continuing to perpetuate these terrible images of crime?
    It's anyone who is accessing the dark walls of the web—child predators, people who are interested in that kind of stuff. They are the ones who are trading these images and uploading them on a regular basis—pretty much daily.
    They are doing it with impunity.
     Unfortunately, yes. It's not only the images. There's regular talk about my child.
     I can't imagine, as a parent, what you're going through and what she's going through.
     It's a very scary situation.
     The message you're sending us is very clear: that we need to take action. I think all members of the committee understand that. I can't thank you enough for coming forward today to share that with us.
    I have questions for the other witnesses.

[Translation]

    Now I'm going to turn to Ms. Bussières McNicoll and Ms. Claveau.
    Part 1 of Bill C‑63 establishes fines. Operators are liable to “a fine of not more than 3% of the person’s gross global revenue or $10 million, whichever is greater”.
    It says that, on summary conviction, an operator is liable to “a fine of not more than 2% of the person’s gross global revenue or $5 million, whichever is greater”.
    Individuals are liable to “a fine of not more than $50,000”. That seems pretty low given the repercussions of the offence in question, such as the impact on Witness 1, her daughter and family.
    It's one thing to put a legislative framework in place, but it's another to establish penalties in order to end the scourge. It's clear that the case involving Witness 1's daughter calls for significant penalties.
    What do you think of the penalties I just mentioned and the approach outlined in the bill?
    I would like Ms. Bussières McNicoll to answer first.
(1645)
    Thank you for your question.
    I would say, at the outset, that it's important to put into context the fact that the bill establishes seven types of harmful content. When considering penalties for individuals, lawmakers mustn't go too far by unduly punishing individuals in connection with certain types of content.
    As far as the penalties for operators are concerned, I will let those who wish to do so comment on the size of the fines. However, I will say that it's important to keep something in mind: the higher the penalty is, the clearer the duty needs to be. Otherwise, operators will want to fulfill the vague duties imposed on them at all costs, possibly at the expense of users' freedom of expression.
    It comes back to the situation I described earlier. Taking an excessively cautious approach in relation to flagged content and responding in a very swift and disproportionate way to assess that content could be harmful to online free speech.
    Your concern has to do with the definitions proposed in the bill and the direction taken. Thank you for clarifying that.
    Ms. Claveau, I have the same question for you about the structure of the bill and part 1 as it relates to penalties.
    I think everyone agrees that it's necessary. What do you think of the approach taken in part 1 of the bill?
    I will let Mr. Le Grand Alary answer that.
    Thank you for your question.
    As you no doubt saw when reading our brief, we didn't comment specifically on part 1 of the bill.
    Generally speaking, though, when it comes to these types of penalties and fines, especially an administrative monetary penalty regime, a whole process goes into determining the amounts of the fines, whether they apply to individuals or businesses. In many cases, it's a percentage of the business's revenues. A lot of factors are taken into account.
    I won't comment on whether the approach is consistent or appropriate, but I will say that a lot of work has to go into establishing an administrative monetary penalty regime.
    I encourage you to compare this regime with others that have already been adopted to see whether there are any similarities. You can also look to the teachings of the Supreme Court in decisions relating to the validity of such regimes.

[English]

     Thank you. That is your time, Mr. Julian.
    That completes our first round.
    We are now moving into the second round. This will be the final round for the first panel. It will be for 15 minutes with five minutes, five minutes, two and a half minutes and two and a half minutes.
    We're starting with you, Ms. Ferreri.
    You have five minutes.
    Thank you to the witnesses for your testimony today.
    Witness 1, do I have permission to call you by the first name that you used? Is that okay?
(1650)
    Yes.
    Thank you.
    Jane, what you've done today is very courageous. People don't know this is happening. They have no idea. I believe that halfway to beating this is.... Obviously, we have to do legislation and implement change, but people don't believe that parents traffic their children. People don't believe that children are used as sexual tools online daily, as you've testified here today. They don't know because they don't want to believe that humanity is that horrific.
    I want to tell you thank you. We can't fix anything if we don't acknowledge what has actually happened. Thank you for that.
    There are a couple of things I want to point out. The big thing we're trying to sort out here is the best recommendation so that we have implementation as soon as possible to protect children online. We've had witness testimony on sextortion. Children are taking their lives.
    Jane, you're traumatized for the rest of your life. Your child is traumatized for the rest of her life. The impact on the community is significant.
    Right now, the way that Bill C-63 is written, it is calling on—and I'll use the language from it—a digital safety commission of Canada, the digital safety office of Canada, the position of a digital safety ombudsperson, and a mandate for the commission and ombudsperson to follow. This is another aspect of not having action instantly.
    To my Liberal colleague's point of an immediate takedown of the image, you're not going to have that with Bill C-63. You need a regulated body to be put in place, which could take years.
    What we're saying in Bill C-412 is that we would implement this instantly through the actual social media platform. A judge would have the capacity instantly to name the person who has the image, release their name and charge them. The duty of care then falls on the social media platforms to be implementing age verification—which we know they can do through algorithms.
     The issue we're having with Bill C-63 is the same issue we've seen in other regulating bodies. The action doesn't come with the intention.
    The example I will give you is the ombudsperson we have in this country for victims. They've seen an increase of 477%. Nothing happens after the victims go to the ombudsman, right? There's no action tied to it.
    My question for you, Jane, is this. Would you like to see a bill like Bill C-412 that implements instant action on the social media platforms and enables judges to ensure that those names are released so that there is actually a takedown and not just an intention of takedown?
     First, what I'm going to say is that I'm not familiar with the bill you just spoke about.
    My concern is what's in Bill C-63. I see that adding protection and moving forward for my child and the other children.
     I appreciate that you don't know what that bill is, so that's totally fair and I'm happy to share it with you.
    I can tell you that with Bill C-63 there still is this concern of its being years down the road. What I'm saying is that we all want the same thing. We want protection of children today, but if you implement a regulating body, and you don't have duty of care to the social media platforms, then it's not instant, because the regulating body then has to have a meeting, with a meeting, and so on.
    Do you see what I'm saying? It's not direct to the person. Does that make sense?
    I am trusting those who are in charge of Bill C-63 with what they're doing for the protection of all children in Canada.
     Okay. I appreciate that. I think, obviously, I can't stress enough that we absolutely want the protection of children.
    Again, I would put this forward to you. If there were an option between going directly to the social media platform...? I guess I will use you as an example. Right now, why are the images of your child not removed?
     It's because it's not mandated. Nobody has to remove them; they're not told that they have to. There are no consequences.
     Exactly. With Bill C-63, you would still have to go through a person, a regulating body, so let's say it's an ombudsman. They would then have to have a meeting with the regulating body. Then they would have to go to the social media platform.
    What we're saying is that instead of having to go in-between, you would get to go right to a judge; and the judge would say, okay, this is the person—because there's a duty of care for the social media platform to remove that image instantly.
     That is your time, Ms. Ferreri.
    Thank you.
     I hear what you're saying, but I'm trusting the process. As a parent, I'm trusting the process.
     Thank you, Witness 1.
     Moving on to Mr. Bittle, you have five minutes.
     Thank you very much, Mr. Chair.
    Jane, I'd like to echo what my colleagues have said. The great courage you've shown to come forward is absolutely incredible, and your determination to protect not only your own daughter but also other children is commendable. Thank you.
    We heard from Carol Todd at the last meeting, who expressed concern that victims were being asked these technical legal questions, and I don't want to get into these.
    However, because you talked about your experience with police and the current process, and there was no help available, I was wondering if you could talk about what it would mean to have a digital safety commission that could act for victims. I know some people will dismiss it as a bureaucracy, but I was wondering if you could speak to that, to have a voice, if that would be beneficial.
(1655)
     I'm sorry. Were you directing your question towards me?
     I was. Thank you.
     Yes, you're correct. There really wasn't much help in the very beginning.
     I didn't really hear your full question. I didn't realize you were directing it to me. I'm sorry.
     That's okay. I can rephrase it, because the alternative that's being suggested is that victims are required to take it to a court, and take it to a judge, which is, I think, well meaning, but can also take time.
    I was wondering if you could speak about, in your mind, having someone like a digital safety commissioner act on your behalf versus your own requirement of having to take that on your own to a judge or to a court, and how you would see that.
     Yes, I'm totally all for that. I like to focus on the platforms themselves. Those people should be responsible for what content they're sharing. If they had some responsibility, then they wouldn't be allowed to continue to exploit my child.
     You're absolutely right.
    Again, thank you for bringing this forward and speaking up, because I can tell you—and I think Mr. Julian could agree with me—that dealing with large tech companies has not been easy. They've fought regulation along the way, but there are consequences and victims, and there's a requirement for government to act. I think there's some disagreement around this table on what that looks like. I think there is unanimity in acting to protect our kids.
     I'll turn to Madam McNicoll.
    You spoke about protection of privacy versus protection of freedom of speech. I was wondering if you could comment on part 1 of the legislation and its protection of privacy for the victims of these images and the challenge of managing freedom of expression versus the protection of privacy and the rights of the individuals who are exploited by the Internet.

[Translation]

    Thank you for the question.
    First, there are indeed specific legal obligations, as suggested in part 1 of the bill. These obligations would make it possible to quickly ensure the removal of particularly harmful content. I'm thinking here of content that sexualizes children or perpetuates the victimization of survivors and intimate content shared without consent. In this sense, I consider it a significant step forward.
    That said, there are still privacy issues in part 1 of this bill. As a result, one of our recommendations seeks to clarify that the obligations of operators and the obligations of the Digital Safety Commission of Canada and other regulators must respect the privacy of users and operators.
    Let me explain.
    Of course, we know that operators have access to users' personal information as part of their activities. We also know that certain federal legislation already regulates the collection, retention, protection and sharing of confidential and private information. The failure to specifically refer to these obligations can lead to confusion for operators.

[English]

     Thank you. That's your time, Mr. Bittle.
    We'll move on to Monsieur Fortin.

[Translation]

    You have the floor for two and a half minutes.
    Thank you, Mr. Chair.
    I would like to ask Ms. Claveau or the other Barreau du Québec representatives about life imprisonment.
    I gather that the Barreau du Québec considers the provision somewhat broad when it sets out this penalty for a wide range of offences. I also share this view and find it worrying.
    However, if we want to convey the seriousness and gravity of the type of offence involved, is there any way to increase the penalty?
    I understand that you're proposing to review sentences one by one. Couldn't we include a provision whereby, in certain set cases, the maximum or minimum penalty would be double the prescribed penalty?
    Could this be a good option to look into, or do we really need to proceed offence by offence and set out specific penalties?
(1700)
    Thank you for the question.
    I'll refer you to page 8 of our brief. You'll see that we're proposing this exact solution. We're proposing to increase the current penalties and even to increase them by more than the suggested double. This obviously depends on the offence.
    My colleague, Mr. Le Grand Alary, will elaborate on this.
    We drew from a current provision concerning intimate partner violence. Where the offence is motivated by hate, the maximum sentence would be increased as follows: two‑year sentences would become five‑year sentences, five‑year sentences would become 10‑year sentences, 10‑year sentences would become 14‑year sentences and 14‑year sentences would become life sentences.
    You must also understand that the calculations may not always amount to the double. There may be nuances, but this is in line with the logic of the Criminal Code—
    Mr. Le Grand Alary, sorry to interrupt. I don't mean to be rude, but I have only a few seconds left.
    I gather that the one‑size‑fits‑all solution of simply doubling the penalties isn't a good idea.
    Is that right?
    That's right. When an offence carries a sentence of 14 years or more, the defendant is entitled to a preliminary inquiry. When it carries a sentence of five years, the defendant is entitled to a jury trial.
    The Criminal Code already contains a variety of sentencing scales. Merely doubling the sentences may not be the solution. It might be a matter of reviewing them in light of the scales already established for various types of offences.
    Thank you, Mr. Le Grand Alary.
    I think that my time is up, Mr. Chair.

[English]

     That's the time. Thank you.
    Mr. Julian, you have two and a half minutes.

[Translation]

     Thank you, Mr. Chair.
    I would like to turn again to Ms. Claveau and Ms. Bussières McNicoll.
    Ms. Claveau, you spoke about section 13 of the Canadian Human Rights Act and the Canadian Human Rights Commission.
    The minister has already expressed an interest in removing this clause from the bill. However, reinstating section 13 in the Canadian Human Rights Act may hamper the Canadian Human Rights Commission's ability to implement the major process required to handle complaints, given that it already lacks resources to do its job.
    Ms. Claveau, are you concerned about this situation?
    Yes, we're concerned about the situation.
    Along the same lines, if we look at the bill and the current situation, the Digital Safety Commission of Canada set out in the bill won't necessarily have all the resources needed to do its job either.
    Are you also concerned that a situation similar to the one experienced by Witness 1 could happen again? In that case, the issue wasn't addressed promptly. Swift action should have been taken and tough measures put in place to address the victimization of children.
    The Barreau du Québec believes that you generally must make sure that you have all the resources needed to implement the legislation. Otherwise, it won't work. It's really important to make sure.
    Thank you.
    I would like to ask Ms. Bussières McNicoll the same question regarding the need to provide the necessary resources for the Canadian Human Rights Commission and the newly created Digital Safety Commission of Canada.
    Thank you for the question.
    You're quite right about the current human rights tribunals. We elaborate on this topic in the brief that we sent you. Their lack of resources and case backlogs are well documented. It's hard to see how adding the hate speech file to their workload without allocating significant resources will help them. From a strictly pragmatic perspective, this raises an issue.
    We also have other concerns about asking these tribunals, which have highly specific and significant expertise in equality rights, to regulate hate speech and freedom of expression in Canada.
    I have no particular comments regarding the second part of your question. If a new entity is set up to do this work or if new regulators are created, they must receive proper funding.
(1705)

[English]

     Thank you. That is your time, Mr. Julian.
    Thank you to all of the witnesses in the first round. We appreciate your time and attention.
    We're now going to suspend for a few minutes.
(1705)

(1710)

[Translation]

     I call the committee back to order.

[English]

We have, for the second panel, Emily Laidlaw from the University of Calgary; Étienne-Alexis Boucher from Collective Rights Quebec; and Matthew Hatfield from Open Media.
    Members, all these witnesses appearing by video conference have been tested. They all qualify.
     That being said, I'd like to turn matters over to the witnesses for their opening statements.
    We'll start with you, Ms. Laidlaw, for five minutes.
     My name is Emily Laidlaw. I'm a Canada research chair and associate professor of law at the University of Calgary.
     At the last committee meeting, and earlier today, you heard horrific stories, bringing home the harms this legislation aims to address. With my time, I'd like to focus on the legal structure for achieving these goals, why law is needed, why part 1 of Bill C-63 is structured the way it is and what amendments are needed.
    My area of expertise is technology law and human rights: specifically, platform regulation, freedom of expression and privacy. I have spent my career examining how best to write these kinds of laws. I will make three points with my time.
    First, why do we need a law in the first place? When the Internet was commercialized in the 1990s, tech companies became powerful arbiters of expression. They set the rules and how to enforce them. Their power has only grown over time.
     Social media are essentially data and advertising businesses and, now, AI businesses. How they deliver that to consumers and how they design their products and services can directly cause harm. For example, how they design their algorithms makes decisions about our mental health, pushing content encouraging self-harm and hate. They use persuasive techniques to nudge addictive behaviour, such as with endless scrolling rewards and constant notifications.
     Thus far in Canada, we have largely relied on corporate self-governance. The EU, U.K. and U.S. passed legislation decades ago. Many are on their second-generation versions of these laws, and a network of regulators is working together to create global coherence.
     Meanwhile, Canada has never passed a comprehensive law in this space. The law that does apply is piecemeal, mainly a bit of defamation, privacy and competition law, circling important dimensions of the problem, but not dealing with it directly.
     Where does that leave us in Canada? Part 1 of Bill C-63 is the product of years of consultation, to which I contributed. In my view, with amendments, it is the best legal structure to address online harms.
    That brings me to my second point. This legislation impacts the right to freedom of expression.
     Our expert panel spent considerable time on how best to protect freedom of expression, and the graduated approach we recommended is reflected in this bill.
     There are three levels to this graduated approach.
     First, the greatest interference with freedom of expression is content removal, and the bill requires that for only two types of content that are the worst of the worst, the stuff that we all agree should be taken down: child sexual abuse material and non-consensual disclosure of intimate images, both of which are crimes.
    At the next level is a special duty to protect children, recognizing their unique vulnerability. The duty requires that social media integrate safety by design into their products and services.
    The third, the foundation, is that social media have a duty to act responsibly. This does not require content removal. It requires that social media mitigate the risks of exposure to harmful content.
    In my view, the bill aligns with global standards because it's focused on systemic risks of harm and takes a risk mitigation approach, coupled with transparency obligations.
    Third, I am not here to advocate that the bill be passed as is. The bill is not perfect. It should be carefully studied and amended.
     There are also other parts of the bill that don't necessarily need to be amended but that entail hard choices that should be debated: the scope of the bill; which harms are included and which are not; which social media are included, based on size or type; the regulatory structure; a new versus an existing body and what powers it should have; and what should be included in the legislation versus left to be developed later in codes of practice or regulations.
     There are, however, amendments that I do think are crucial. I'll close with this list. I have three.
     One, the duty to act responsibly should also include a duty to have due regard for fundamental rights in how companies mitigate risk. Otherwise, social media might implement sloppy solutions in the name of safety that disproportionately impact rights. This type of provision is in the EU and U.K. legislation.
    Two, the duty to act responsibly and duty to protect children should clearly cover algorithmic accountability and transparency. I think it's loosely covered in the current bill, but it should be fleshed out and made explicit.
    Three, the child protection section should be reframed around the best interests of the child. In addition, the definitions of harmful content for children should be amended. There are two main amendments here. First, content that induces a child to harm themselves should be narrowly scoped so that children exploring their identity are not accidentally captured. Second, addictive design features should be added to the list.
    Thank you for your time. I look forward to our discussion.
(1715)
     Thank you, Ms. Laidlaw.
    We'll turn now to Mr. Boucher.

[Translation]

    You have the floor for five minutes.
    Good evening, parliamentarians, honourable members of the House of Commons Standing Committee on Justice and Human Rights.
    Thank you for this opportunity to speak as part of the pre‑study on Bill C‑63, which concerns online hate speech.
    My name is Étienne‑Alexis Boucher. I'm the president of Droits collectifs Québec. I was supposed to be joined by François Côté, senior legal officer at Droits collectifs Québec. Unfortunately, he can't join us because his microphone isn't a model approved for interpretation.
    Droits collectifs Québec is a non‑profit organization governed by an independent board of directors. It identifies as an agent of social transformation and operates throughout Quebec. Our mission is to help advocate for collective rights in Quebec, particularly with regard to people's language and constitutional rights. Our approach is non‑partisan. The organization's work encompasses many areas of action, including public education, social mobilization, political representation and legal action.
    I've just given a brief overview of the organization. I would now like to focus on the Quebec consensus, which covers two aspects. We've already addressed the first, and the witnesses in the first panel touched on it earlier. We heard particularly poignant evidence from the mother of a young woman whose intimate images were shared.
    While Ottawa refused to budge on this issue, Quebec took the lead and became a pioneer in the field. The National Assembly adopted measures in an area that would normally fall under the Criminal Code, even though Quebec has no power over the Criminal Code, at least as things currently stand. Using its constitutional prerogatives, Quebec adopted measures concerning the sharing of intimate content without consent. In other words, since the federal government wasn't addressing the issue, Quebec responded to the Quebec consensus with this initiative.
    Another example of the Quebec consensus is the National Assembly's unanimous adoption of the request to repeal subsections 319(3)(b) and 319(3.1)(b) of the Criminal Code. These subsections state that “no person shall be convicted of an offence” of wilfully promoting hatred against an identifiable group “if, in good faith, the person expressed or attempted to establish by an argument an opinion on a religious subject or an opinion based on a belief in a religious text.”
    This exception in the name of religious freedom has no place in a modern state such as Canada. We know that the Constitution of 1867 states that power in Canada is granted by divine right: even the head of state isn't chosen democratically by the citizens of Canada, but by God. However, it's now the 21st century. I don't think that freedom of religion should rank higher than freedom of conscience, for example, or freedom of political opinion, for which everyone acknowledges that certain limits are valid. For example, teachers may not, in the course of their duties, express opinions on the political status of Quebec or Canada. These limits on a basic freedom are perfectly justifiable.
    However, we find it completely unacceptable to turn something normally considered a crime into a non‑crime in the name of freedom of religion. We therefore encourage parliamentarians to heed the call of Quebec's justice minister. Once again, the vast majority of Quebeckers are in agreement. The justice minister expressed a widely‑held consensus that hate speech based on religion is simply unacceptable.
    There have been some concrete examples. We've already seen the abuses and effects resulting from this exception. A call to genocide was made in the name of a religion in a fully public manner, in front of hundreds of thousands of individuals, if we count the people who viewed the images widely available on social media.
(1720)
    Unfortunately, this call could not be criminally prosecuted, probably because of the exception. Again, we think this is unacceptable. This position is held by the Quebec government and by organizations such as the Rassemblement pour la laïcité, of which I am the vice-president. Ours is an umbrella organization for dozens of organizations representing thousands of people.
    Thank you, Mr. Boucher.

[English]

     Thank you. You may have more time if the members decide to give it to you. For now, however, your time is up.
    Now we're going to move on to Mr. Hatfield.
    You have five minutes, sir.
     Good evening. I'm Matt Hatfield, the executive director of OpenMedia, a non-partisan, grassroots community of over 250,000 people in Canada working for an open, affordable and surveillance-free Internet.
    I'm joining you from the unceded territory of the Stó:lō, Tsleil-Waututh, Squamish and Musqueam nations.
    It's a pretty remarkable thing to be here today to talk about the online harms bill. When Canadians first saw what this bill might look like as a white paper back in 2021, we didn't much like what we saw. OpenMedia called it a blueprint for making Canada's Internet one of the most censored and surveilled in the democratic world, and we were far from alone in being concerned.
    For once, our government listened. The rush to legislate stopped. National consultations were organized across the country on how to get regulation right with a wide range of stakeholders and experts on harms and speech. The resulting part 1 of Bill C-63 is an enormous, night-and-day improvement. Simple-minded punitive approaches that would have done more harm than good are gone, and nuances and distinctions made throughout show real sophistication about how the Internet works and how different harms should be managed. Packaging part 1—the online harms act itself—with changes to the Criminal Code and Human Rights Act proposed alongside it badly obscured that good work. That's why, alongside our peers, we called for these parts to be separated and why we warmly welcome the government's decision to separate those parts out.
    I'll focus here on part 1 and part 4.
    OpenMedia has said for years that Canadians do not have to sacrifice our fundamental freedoms to make very meaningful improvements to our online safety. The refocused Bill C-63 is the proof. Instead of trying to solve everything unpleasant on the Internet at once, Bill C-63 focuses on seven types of already-illegal content in Canada, and treats the worst and most easily identifiable content—child abuse material and adult material shared without consent—most severely. That's the right call. Instead of criminalizing platforms for the ugly actions of a small number of users, which would predictably make them wildly overcorrect to surveil and censor all of us, Bill C-63 asks them to write their own assessments of the risks posed by these seven types of content and document how they try to mitigate that risk. That's the right call again. It will put the vast engineering talent of platforms to work for the Canadian public, thinking creatively about ways to reduce these specific illegal harms. It will also make them explain what they are doing as they do it, so we can assess whether it makes sense and correct it if it does not.
     However, I want to be very clear: It is not the time to pass Bill C-63 and call it quits. It's just the opposite. Because the parts that are now being separated raise so many concerns, there has not been nearly enough attention paid to refining part 1. I know you'll be hearing from a range of legal and policy experts about concerns they have with some of the part 1 wording and recommended fixes. I hope you will listen very carefully to all of them and pass on many of the fixes they suggest to you.
    This is not the time to be a rubber stamp. The new digital safety commission is granted extraordinary power to review, guide and make binding decisions on how platforms moderate the public expression of Canadians in the online spaces we use the most. That's appropriate if, and only if, you make sure it carefully considers and minimizes impacts on our freedom of expression and privacy. It isn't good enough for the commission to think about our rights only in its explicit decisions. A badly designed platform safety plan could reduce an online harm but have a wildly disproportionate impact on our privacy or freedom of expression. You need to make sure platforms and the regulator make written assessments of the impact of their plans on our rights and ensure that any impact is small and proportionate to the harm mitigated. Bill C-63's protections of private, encrypted communication, and against platforms surveilling their users, need to be strengthened further and made airtight.
    OpenMedia has a unique role in this discussion because we are both a rights-defending community that will always stand up for our fundamental freedoms and a community of consumer advocates who fight for common-sense regulation that empowers us and improves our daily lives. If you do your work at this committee, you can make Bill C-63 a win on both counts. Since 2021, members of our community have sent nearly 22,000 messages to government asking you to get online harms right. Taking your time to study Bill C-63 carefully and make appropriate fixes before passing it would fulfill years of our activism and make our Internet a better, healthier place for many years to come.
    Thank you, and I look forward to your questions.
(1725)

[Translation]

    You're not seeing things. I'm replacing Mr. Brock, but the process remains the same.
    I'd like to thank the witnesses for their presentations.
    Mr. Jivani, you have the floor for six minutes.
    Thank you, Mr. Chair.

[English]

     My first question is for Mr. Hatfield.
    Thank you for your presentation.
    I'm curious, given the very clear concerns you've expressed relating to parts 2 and 3 of Bill C-63, why you're not more concerned about some sections of part 1, particularly those related to the digital safety commission, the digital safety office and the digital safety ombudsperson, which would lay some of the bureaucratic groundwork that makes parts 2 and 3 possible.
    Are you concerned about those sections of part 1? Would you care to give us some specific concerns you have related to part 1, which we're focused on today?
    I don't think that part 1 requires parts 2 and 3. I think we can fully separate them, and I think the government should fully separate them.
    Regarding the concerns we have with part 1, it's a huge amount of power we're putting into the hands of the regulator. We believe it is important that Canada have a regulator here, in the same way that we have a privacy regulator and a competition regulator. Digital safety is a complex and nuanced enough issue that having a source of government expertise to help make good decisions on it is helpful, but that doesn't mean you should just hand this regulator a blank cheque. You need to put some really careful assessment requirements and limits on that regulator before this bill leaves committee.
    Would you care to elaborate on what some of those limitations would be, from your point of view, that would need to be considered?
(1730)
    The single greatest, in our view, is that its decisions need to include mandatory assessments of their impact on freedom of expression and privacy, both for decisions made directly by the commission and for its approval of safety plans.
    If they approve safety plans submitted by the platforms, the platforms need to write down what they think the privacy and free expression impacts are. The regulator needs to assess and determine that those are proportionate with the opportunity, frankly, for a case to be taken against them, saying that perhaps the plans were not proportionate, so that they make impactful decisions but within bounds.
     Mr. Hatfield, there are other bills on the table for consideration, for example, that would be more focused on updating existing laws and making it easier for the existing criminal justice system that we have to be responsive to victims and to hold platforms more accountable. It sounds like—and correct me if I'm wrong—you prefer the creation of what I would consider to be new bureaucracy as opposed to strengthening the system that we currently have. Why?
     I think that there's a lot to appreciate in Bill C-412. We do think that bill, if this bill does not pass, is worthy of study, but I think that Bill C-63 would accomplish more over a longer period of time for Canadians than Bill C-412. I think that Bill C-412 is narrow, perhaps too narrow a bill. When it comes to the harms that both of them treat, I think having a regulator involved is really beneficial.
    Now, if you look at privacy law, we don't just say, “Here are your privacy laws on paper, and here's a private right of action, go to it. Our privacy is defended.” We found it extraordinarily valuable to have a Privacy Commissioner who can assist Canadians in asserting their privacy rights. Our hope for this digital safety commission is that they will function similarly.
     Mr. Hatfield, you mentioned the long-term effects of part 1. I would put forward that those long-term effects are the very parts 2 and 3 that you're concerned about, which is why I think a lot of Canadians who agree with a lot of the objectives the government has expressed relating to part 1 are concerned about the ripple effects of what that will mean down the road, especially when the current government has already stated its intentions with a part 2, a part 3 and a part 4. They've already said that there will be sequels. If you are concerned about the sequels, maybe the original is worth reconsidering.
     I appreciate your contributions. Thank you.
    How much time do I have?
    You have a minute and a half.
    With the remainder of our time, Ms. Laidlaw, I'd like to come back to you to ask you to elaborate a bit more on the amendments that you referenced in your opening statement. If you'd like to give us a bit more detail and more context, I'd appreciate it.
     Yes, thanks so much for that opportunity.
    What I think is critical, and this builds on what Matt was just talking about, is that there is always a risk of overcorrection if the focus is purely on harms. That's why it's important to recognize that one of the key harms can be to freedom of expression, and to privacy in particular. It's important for the companies to file digital safety plans that explain how they make decisions bespoke to their services, decisions that balance out the scope of harms while thinking through a way of doing it that is most protective of privacy and freedom of expression. The digital safety commission would have a duty to consider that in what it does, but the onus needs to also be on the company.
    I think, concerning the child protection measures, that the best interest of the child is protected under international law. I think that is the blueprint here. Detailing specifically what it is about child protection that we're looking for when we talk about safety by design is incredibly important.
    Of course, there's algorithmic accountability. I can discuss this further with you, but I'm conscious of time.
     Thank you.

[Translation]

    Mrs. Brière, you have the floor for six minutes.
    Thank you, Mr. Chair.
    It's great to see you in the chair, Mr. Chair.
    I would like to say hello to Étienne-Alexis Boucher, who hails from my region.
    I'm going to direct my questions to Ms. Laidlaw and Mr. Hatfield.
    Ms. Laidlaw, it was said earlier that the bill was the result of several years of consultation. You've been part of this process.
    On what basis was the list of seven categories of harmful content drawn up?
    The bill sets out an obligation for platforms to act responsibly. What does that mean in concrete terms? I imagine it implies the obligation to identify the risk of harm and mitigate its effects.
    I'd like to hear your comments on that.
(1735)

[English]

    Thank you.
    From many discussions with other governments, I can say the duty to act responsibly is generally the same as the duty of care in the U.K. or the due diligence and risk management obligations in Europe broadly. It's all about a due diligence approach by companies at a systemic level.
    The duty to act responsibly came out of the Commission on Democratic Expression, and it's really practical. Their recommendation was that we shouldn't use the language "duty of care". That's language from tort law, and it might be confusing if this goes to court. We want this to be a stand-alone statutory duty, one simply set out in the legislation, and "duty to act responsibly" captured that better.
    When it comes to the harms that are included, I think that is a point for debate. In discussions with colleagues, one that could be added to the list is the crime of identity fraud. That is a major issue, and I think it would be appropriate to include that.
    It's notable what's not on the list. A point I have discussed many times is the inclusion of mis- and disinformation, which generally falls into the category of "lawful but awful". It is included in the EU legislation. The decision was not to include it in Canada because of what I take to be the problematic risks to freedom of expression; that is, we would be biting off more than a regulator should take on.
    One last point is that we have to think about what a regulator can practically take on. We know this will cost money, so part of this is a discussion of which high-risk issues we should practically include, issues a regulator can investigate and make a difference on now.
    I'll leave it there.
    Thank you.

[Translation]

    Thank you very much.
    You are of the opinion that the list of categories of harmful content should be reviewed.
    Did I understand you correctly?

[English]

     I'm acknowledging that it's a point of debate; there could be reasonable arguments either way. I am comfortable proceeding with the list as is. The only one I would potentially add is identity fraud, but I would not move far from the list as it stands, considering what falls within federal jurisdiction to address as well.

[Translation]

    Thank you.
    In one of your publications, you said that this bill covered the basics, but certain amendments could be made.
    In one article, which was published in English, you say: “This bill gets the big things right.”
    Are you still of that opinion?

[English]

    Yes, 100%. The biggest point of debate was how to actually structure a body to address these issues that balances harms and freedom of expression, and this does that. That is because of the years of consultations, and because it will fit in with the other global regulators.

[Translation]

    Since the beginning of the study, a number of parents, particularly mothers, have told us horrible stories about what their children had experienced. Some young people have even committed suicide.
    Do you believe that Bill C‑63 will really allow us to achieve the goals as they are set out?

[English]

     Yes. The caveat is that we're never going to rid the Internet of harm, and it's never going to be a perfect piece of legislation. This is about making things better.
    This cannot happen with existing law or by improving existing laws, and it cannot happen just through the courts, although the courts are an important process. This requires a regulator that can work with industry, that is more flexible, that can work with impacted communities and civil society groups, and that has the power for quick content removal for the worst of the worst.
    This is an ongoing project and, in the long term, I think this will make things better, but certainly not perfect.
(1740)

[Translation]

    You have 10 seconds left, Mrs. Brière.
    Okay.
    Ms. Laidlaw, the platforms already have their own rules, and we know that they sometimes don't follow them.
    Therefore, do you believe that Bill C‑63 will be able to hold them in check?
    Please answer yes or no, Ms. Laidlaw.

[English]

    Yes, it's about minimum standards.

[Translation]

    I will now take the floor for six minutes.
    I'd like to thank all the witnesses for being with us today.
    Mr. Boucher, Mr. Côté, Mr. Hatfield and Ms. Laidlaw, your participation is invaluable.
    Ms. Laidlaw, on the subject of hate, Bill C‑63 provides that “A person may, with the Attorney General's consent, lay an information before a provincial court judge if the person fears on reasonable grounds that another person will commit...”.
    Are you not concerned that this wording is a little too vague and that it could lead to abuse?

[English]

     I do want to emphasize that what I support fast-tracking for study is part 1. I fully support separating the bills. I think that parts 2 and 3 are fundamentally different.
     With that in mind, with regard to the peace bond provisions, I defer significantly to criminal lawyers who are practising because my understanding is that there are so many hoops to jump through to be able to obtain a peace bond that this isn't the easy win that one would think, in going to court and chilling expression.
    That said, the prospect of it can have that chilling effect. That's why I think much broader consultation is needed on that type of provision. I'm also not sure it's even needed: if the fear is of property crime, for example, you can already get a peace bond for that under existing law. I remain to be convinced that it is a tool that will be worthwhile.

[Translation]

    Thank you, Ms. Laidlaw.
    Mr. Hatfield, I'll ask you the same question. What is your opinion on the possibility of laying an information when there are reasonable grounds to fear that an offence will be committed?

[English]

     We're very seriously concerned by parts 2 and 3. If those were still included in the bill, our presentation and our recommendations would be quite different here.
    However, our understanding is that we now have the opportunity to see only part 1 and part 4 pass together. That is what we're potentially supporting.

[Translation]

    Thank you, Mr. Hatfield.
    Mr. Boucher, has Droits collectifs Québec taken a look at this issue?
    Do you have any comments you'd like to share?
    Could you repeat the question?
    I'm referring to part 2 of the bill, which deals with hate. The minister announced that he was going to split the bill in two. I obviously agree with that, since it was the Bloc Québécois that made the request in the first place. We agree that the bill needs to be split in two. However, until that actually comes to pass, we are conducting a prestudy of Bill C‑63 in its entirety.
    I'm taking the liberty of asking you this question, even though I, too, think that my question should be asked as part of another study.
    The bill reads as follows: “A person may, with the Attorney General's consent, lay an information before a provincial court judge if the person fears on reasonable grounds that another person will commit...”.
    Does that sound reasonable? Are you not concerned that this could open the door to abuse of this reporting mechanism?
    I would ask Mr. Côté to answer you.
    I believe Mr. Côté doesn't have the House of Commons headset. The one he is using could be harmful to our interpreters. That is why, unfortunately, we will not be able to get his opinion, even though it would have been very valuable.
(1745)
    I would first like to say that our testimony was not about this very specific issue. However, I agree with the opinion of the experts who testified before this committee. It seems obvious that it would be a good idea.
    Mr. Boucher, are you able to tell us about the definition of the word “hate” as proposed in Bill C‑63?
    You may have heard the comments made by the witnesses who appeared in the first part of the meeting. The Barreau du Québec has expressed its opinion on this definition. We were told about a Supreme Court decision, the name of which escapes me. In that case, a judge looked at that definition.
    I'd like to hear your thoughts on that.
    How should that word be defined and what are the parameters that would make it possible to frame this concept?
    My colleagues from the Barreau du Québec seemed to be saying that, basically, the Supreme Court did a relatively good job of defining what hate speech could be.
    In a way, I think it's—

[English]

     On a point of order, we're not getting any interpretation.

[Translation]

    Just a moment, Mr. Boucher. There's no interpretation.

[English]

     Does he hear the interpreter?

[Translation]

    Mr. Boucher, I'm sorry, but I've just learned that you don't have the right headset either. What I understood is that you and Mr. Côté received headsets last May, but the headsets—
    It's a headset that I purchased.
    Unfortunately, neither of you is able to testify because of this reason.
    In my opinion, this was a mistake made by the House of Commons. From what I understand, those headsets were sent to you last spring, but you didn't receive the new ones.
    Please know that if you would like to come back to this committee, we can invite you at a later date.
    That said, if you wish, you can simply send us your comments or answers to the questions put to you in writing. Those answers must be sent to all committee members without delay, and they will be taken into consideration.
    Again, if you wish to be invited back, I will personally ask that we invite you at a future meeting. At that time, we'll send you the right headsets.
    I apologize on behalf of our committee. If you are able to send us your observations and comments in writing, I would appreciate it.
    Please know that we take this issue very seriously. It's important to know that interpreters have suffered damage to their hearing. I didn't know it could happen. I learned that during the pandemic. Since then, we've been using videoconferencing a lot, and we need to protect our interpreters, whom we really need to do our job.
    Do let me know if you want to testify again. I will make sure that you are invited at a later date.
    Thank you.
    I had one minute of speaking time left, and I'm going to allow myself to ask my question right away.
    Ms. Laidlaw, when it comes to maximum sentences, we're talking about life sentences for certain aggravated offences related to hate speech. For example, someone who committed a hate-related offence could be subject to a life sentence.
    In your opinion, is this the right sentence here? Should we review that provision?
(1750)

[English]

     This should be revised. I think that the risk of life imprisonment for a speech crime is wholly disproportionate.

[Translation]

    I see Mr. Hatfield nodding. I'm very happy.
    Thank you, Ms. Laidlaw.
    My time is up, so we'll go back to Mr. Julian.
    Mr. Julian, you have the floor for six minutes.
    Thank you very much, Mr. Chair.
    I completely agree with what you said about the interpreters. You made an important and fair call. The interpreters must be properly protected at all times. They really are one of the foundations of our Parliament. It's a shame to have to interrupt a witness's testimony, but you did the right thing and acted responsibly.

[English]

     I want to come to Ms. Laidlaw.
    Thank you very much for your presentation today. You talked about algorithm issues as well. I want to come back to you on that issue.
    We have before Parliament, as you know, Bill C-292, an act respecting transparency for online algorithms. To what extent do you think, in addition to part 1 of the bill, we need to look at that legislation, or perhaps incorporate portions of it, to ensure that Canadians can actually see transparency around algorithms and, in that way, ensure greater safety?
     Thank you for the question.
    I agree. Algorithmic accountability should be added to Bill C-63. Earlier when I spoke, I said that it's loosely covered, but it requires a leap of faith. We need more in the legislation because the duty to act responsibly would leave it to the digital safety commission to develop codes of practice and regulations. There is scope there for algorithmic transparency and algorithmic amplification to be covered, but that kind of digital safety by design and the algorithmic accountability need to be embedded in the legislation itself.
    The same goes for the children's provisions. They cover algorithms when it comes to safety by design, but in one very short provision. If we take some of the provisions in your bill and use them as a blueprint to flesh out those parts of part 1 of Bill C-63, then I think we would be in a good position.
     Thank you very much for that.
    I want to ask the same question of Mr. Hatfield.
    Mr. Hatfield, you mentioned that Bill C-63, part 1, accomplishes more than a bill that has been raised around this table—Bill C-412—so you've resolved that for us. It's very clear that we should be putting the focus on Bill C-63, part 1.
    To what extent do you believe that algorithm transparency is also important to achieve, and to what extent would you like to see some of the provisions of Bill C-292 incorporated into Bill C-63, part 1?
    Yes, it would be healthy for Bill C-63 to go a bit further in looking at what algorithms are doing, within a framework of providing more transparency, giving researchers good access to study algorithms and determining how they're impacting the public. If MPs could agree on some language there, get it done and move on to the rest of the bill, that would be healthy. I wouldn't hinge the bill's future on it, but I think that would be the appropriate approach.
     I'll ask both of you this question. What are the international examples that we need to be considering as we deal with the bill, part 1, to ensure that we not only get the best possible bill coming out of committee, but also get one that does what needs to be done at this critical time?
    I'll start with you, Ms. Laidlaw.
    I recommend that the committee first study the Digital Services Act. The one thing to keep in mind is that these types of European legislation tend to be a lot shorter and leave a lot until later. I think that Bill C-63 is a little more fulsome, but especially for the algorithmic accountability, with the way it's been addressed there, it could be helpful here.
    The other thing to consider for some aspects of it would be the U.K.'s Online Safety Act. We have also drawn certain aspects from Australia's eSafety Commissioner structure. I can't remember the name of the legislation at the moment.
    Those are the three that I would recommend that you look at.
(1755)
    We can learn from all of that. Essentially, as we look at international examples, hopefully they'll look at what we put forward. We're hopefully setting a higher standard, not impinging on freedom of expression, but putting in place the protections that are so important.
    I'll go to you, Mr. Hatfield, with the same question on the international examples that we need to consider as we work through this bill to get the best possible result.
     For the international examples, I have to mirror what Ms. Laidlaw said. I think it is the DSA first, and then looking at some of what our Commonwealth peers have done in Australia and the U.K.
     From the perspective of partisanship, in the U.K. it was a Conservative government that moved a bill through with some of the same parameters as Canada's bill. I would encourage everyone to remember that.
     If we were simply comparing Bill C-412 with no changes against Bill C-63 with no changes, every part, with parts 2 and 3 included, then in that contest OpenMedia would prefer Bill C-412. The exciting opportunity you have here, given that we're potentially looking at just parts 1 and 4, is to strengthen and pass a version of Bill C-63, which, of the Canadian examples, I think provides the best overall protection.
    You've been very clear that in terms of limiting the regulator, you believe we need to ensure that mandatory assessments of impacts on freedom of expression and privacy are part and parcel of every decision that is made. On a scale of one to 10, how important is that to you, with 10 being...?
    It's necessary for the bill to function. We would have concerns if it weren't dealt with.

[Translation]

    Mr. Chair, may I continue?
    I would have liked to let you continue, Mr. Julian, but your time was up 10 seconds ago. Thank you.
    It is now 5:57 p.m., and we have to end the meeting at no later than 6:00 p.m.
    I would therefore like to thank all the witnesses for being with us.
    I would also like to thank our colleagues from the House who ensured that this meeting was held properly.
    We'll see each other soon.
    The meeting is adjourned.