Mr. Speaker, I rise to respond to a question of privilege raised by the member for on February 26 regarding the alleged premature disclosure of the content of Bill , the online harms act.
I would like to begin by stating that the member is incorrect in asserting that there has been a leak of the legislation. I will outline the comprehensive consultation process and the information that was in the public domain on this issue long before the bill was placed on notice.
Online harms legislation is something the government has been talking about for years. In 2015, the government promised to make ministerial mandate letters public, a significant departure from the secrecy surrounding those key policy commitment documents under previous governments. As a result of the publication of the mandate letters, reporters are able to use the language from these letters to anticipate what a government bill on notice may contain.
In the 2021 Liberal election platform entitled “Forward. For Everyone.”, the party committed to the following:
Introduce legislation within its first 100 days to combat serious forms of harmful online content, specifically hate speech, terrorist content, content that incites violence, child sexual abuse material and the non-consensual distribution of intimate images. This would make sure that social media platforms and other online services are held accountable for the content that they host. Our legislation will recognize the importance of freedom of expression for all Canadians and will take a balanced and targeted approach to tackle extreme and harmful speech.
Strengthen the Canada Human Rights Act and the Criminal Code to more effectively combat online hate.
The December 16, 2021, mandate letter from the to the Minister of Justice and Attorney General of Canada asked the minister to achieve results for Canadians by delivering on the following commitment:
Continue efforts with the Minister of Canadian Heritage to develop and introduce legislation as soon as possible to combat serious forms of harmful online content to protect Canadians and hold social media platforms and other online services accountable for the content they host, including by strengthening the Canadian Human Rights Act and the Criminal Code to more effectively combat online hate and reintroduce measures to strengthen hate speech provisions, including the re-enactment of the former Section 13 provision. This legislation should be reflective of the feedback received during the recent consultations.
Furthermore, the December 16, 2021, mandate letter from the to the Minister of Canadian Heritage also asked the minister to achieve results for Canadians by delivering on the following commitment:
Continue efforts with the Minister of Justice and Attorney General of Canada to develop and introduce legislation as soon as possible to combat serious forms of harmful online content to protect Canadians and hold social media platforms and other online services accountable for the content they host. This legislation should be reflective of the feedback received during the recent consultations.
As we can see, the government publicly stated its intention to move ahead with online harms legislation, provided information on its plan and consulted widely on the proposal long before any bill was placed on the Notice Paper.
I will now draw to the attention of the House just how broadly the government has consulted on proposed online harms legislation.
First, with regard to online consultations, from July 29 to September 25, 2021, the government published a proposed approach to address harmful content online for consultation and feedback. Two documents were presented for consultation: a discussion guide that summarized and outlined an overall approach, and a technical paper that summarized drafting instructions that could inform legislation.
I think it is worth repeating here that the government published a technical paper with the proposed framework for this legislation back in July 2021. This technical paper outlined the categories of proposed regulated harmful content, and it addressed the establishment of a digital safety commissioner and a digital safety commission, regulatory powers, enforcement and more.
Second is the round table on online safety. From July to November 2022, the conducted 19 virtual and in-person round tables across the country on the key elements of a legislative and regulatory framework on online safety. Virtual sessions were also held on the following topics: anti-Semitism, Islamophobia, anti-Black racism, anti-Asian racism, women and gender-based violence, and the tech industry.
Participants received an information document in advance of each session to prepare for the discussion. This document sought comments on the advice from the expert advisory group on online safety, which concluded its meetings on June 10. The feedback gathered from participants touched upon several key areas related to online safety.
Third is the citizens' assembly on democratic expression. The Department of Canadian Heritage, through the digital citizen initiative, is providing financial support to the Public Policy Forum's digital democracy project, which brings together academics, civil society and policy professionals to support research and policy development on disinformation and online harms. One component of this multi-year project is an annual citizens' assembly on democratic expression, which considers the impacts of digital technologies on Canadian society.
The assembly took place between June 15 and 19, 2023, in Ottawa, and focused on online safety. Participants heard views from a representative group of citizens on the core elements of a successful legislative and regulatory framework for online safety.
Furthermore, in March 2022, the government established an expert advisory group on online safety, mandated to provide advice to the Minister of Canadian Heritage on how to design the legislative and regulatory framework to address harmful content online and how to best incorporate the feedback received during the national consultation held from July to September 2021.
The expert advisory group, composed of 12 individuals, participated in 10 weekly workshops on the components of a legislative and regulatory framework for online safety. These included an introductory workshop and a summary concluding workshop.
The government undertook its work with the expert advisory group in an open and transparent manner. A Government of Canada web page, entitled “The Government's commitment to address online safety”, has been online for more than a year. It outlines all of this in great detail.
I now want to address the specific areas that the raised in his intervention. The member pointed to a quote from a CBC report referencing the intention to create a new regulator that would hold online platforms accountable for harmful content they host. The same website that I just referenced states the following: “The Government of Canada is committed to putting in place a transparent and accountable regulatory framework for online safety in Canada. Now, more than ever, online services must be held responsible for addressing harmful content on their platforms and creating a safe online space that protects all Canadians.”
Again, this website has been online for more than a year, long before the bill was actually placed on notice. The creation of a regulator to hold online services to account is something the government has been talking about, consulting on and committing to for a long period of time.
The member further cites a CBC article that talks about a new regulatory body to oversee a digital safety office. I would draw to the attention of the House the “Summary of Session Four: Regulatory Powers” of the expert advisory group on online safety, which states:
There was consensus on the need for a regulatory body, which could be in the form of a Digital Safety Commissioner. Experts agreed that the Commissioner should have audit powers, powers to inspect, have the powers to administer financial penalties and the powers to launch investigations to seek compliance if a systems-based approach is taken—but views differed on the extent of these powers. A few mentioned that it would be important to think about what would be practical and achievable for the role of the Commissioner. Some indicated they were reluctant to give too much power to the Commissioner, but others noted that the regulator would need to have “teeth” to force compliance.
This web page has been online for months.
I also reject the premise of what the member for stated when quoting the CBC story in question as it relates to the claim that the bill will be modelled on the European Union's Digital Services Act. This legislation is a made-in-Canada approach. The European Union model regulates more than social media and targets the marketplace and sellers. It also covers election disinformation and certain targeted ads, which our online harms legislation does not.
The member also referenced a CTV story regarding the types of online harms that the legislation would target. I would refer to the 2021 Liberal election platform, which contained the following areas as targets for the proposed legislation: “hate speech, terrorist content, content that incites violence, child sexual abuse material and the non-consensual distribution of intimate images.” These five items were the subject of the broad-based and extensive consultations I referenced earlier in my intervention.
Based on these consultations, two further categories were added to the list for consideration. I would draw the attention of the House to an excerpt from the consultation entitled, “What We Heard: The Government’s proposed approach to address harmful content online”, which states, “Participants also suggested the inclusion of deep fake technology in online safety legislation”. It continues, “Many noted how child pornography and cyber blackmailing can originate from outside of Canada. Participants expressed frustration over the lack of recourse and tools available to victims to handle such instances and mentioned the need for a collaborative international effort to address online safety.”
It goes on to state:
Some respondents appreciated the proposal going beyond the Criminal Code definitions for certain types of content. They supported the decision to include material relating to child sexual exploitation in the definition that might not constitute a criminal offence, but which would nevertheless significantly harm children. A few stakeholders said that the proposal did not go far enough and that legislation could be broader by capturing content such as images of labour exploitation and domestic servitude of children. Support was also voiced for a concept of non-consensual sharing of intimate images.
It also notes:
A few respondents stated that additional types of content, such as doxing (i.e., the non-consensual disclosure of an individual’s private information), disinformation, bullying, harassment, defamation, conspiracy theories and illicit online opioid sales should also be captured by the legislative and regulatory framework.
This document has been online for more than a year.
I would also point to the expert advisory group's “Concluding Workshop Summary” web page, which states:
They emphasized the importance of preventing the same copies of some videos, like live-streamed atrocities, and child sexual abuse, from being shared again. Experts stressed that many file sharing services allow content to spread very quickly.
It goes on to say:
Experts emphasized that particularly egregious content like child sexual exploitation content would require its own solution. They explained that the equities associated with the removal of child pornography are different than other kinds of content, in that context simply does not matter with such material. In comparison, other types of content like hate speech may enjoy Charter protection in certain contexts. Some experts explained that a takedown obligation with a specific timeframe would make the most sense for child sexual exploitation content.
It also notes:
Experts disagreed on the usefulness of the five categories of harmful content previously identified in the Government’s 2021 proposal. These five categories include hate speech, terrorist content, incitement to violence, child sexual exploitation, and the non-consensual sharing of intimate images.
Another point is as follows:
A few participants pointed out how the anonymous nature of social media gives users more freedom to spread online harm such as bullying, death threats and online hate. A few participants noted that this can cause greater strain on the mental health of youth and could contribute to a feeling of loneliness, which, if unchecked, could lead to self-harm.
Again, this web page has been online for more than a year.
The member further cites the CTV article's reference to a new digital safety ombudsperson. I would point to the web page of the expert advisory group for the “Summary of Session Four: Regulatory Powers”, which states:
The Expert Group discussed the idea of an Ombudsperson and how it could relate to a Digital Safety Commissioner. Experts proposed that an Ombudsperson could be more focused on individual complaints ex post, should users not be satisfied with how a given service was responding to their concerns, flags and/or complaints. In this scheme, the Commissioner would assume the role of the regulator ex ante, with a mandate devoted to oversight and enforcement powers. Many argued that an Ombudsperson role should be embedded in the Commissioner’s office, and that information sharing between these functions would be useful. A few experts noted that the term “Ombudsperson” would be recognizable across the country as it is a common term and [has] meaning across other regimes in Canada.
It was mentioned that the Ombudsperson could play more of an adjudicative role, as distinguished from...the Commissioner’s oversight role, and would have some authority to have certain content removed off of platforms. Some experts noted that this would provide a level of comfort to victims. A few experts raised questions about where the line would be drawn between a private complaint and resolution versus the need for public authorities to be involved.
That web page has been online for months.
Additionally, the summary of the round table on online safety and anti-Black racism states:
Participants were supportive of establishing a digital safety ombudsperson to hold social media platforms accountable and to be a venue for victims to report online harms. It was suggested the ombudsperson could act as a body that takes in victim complaints and works with the corresponding platform or governmental body to resolve the complaint. Some participants expressed concern over the ombudsperson's ability to process and respond to user complaints in a timely manner. To ensure the effectiveness of the ombudsperson, participants believe the body needs to have enough resources to keep pace with the complaints it receives. A few participants also noted the importance for the ombudsperson to be trained in cultural nuances to understand the cultural contexts behind content that is reported to them.
That web page has been online for more than a year.
Finally, I would draw the attention of the House to a Canadian Press article of February 21, 2024, which states, “The upcoming legislation is now expected to pave the way for a new ombudsperson to field public concerns about online content, as well as a new regulatory role that would oversee the conduct of internet platforms.” This appeared online before the bill was placed on notice.
Mr. Speaker, as your predecessor reiterated in his ruling on March 9, 2021, “it is a recognized principle that the House must be the first to learn the details of new legislative measures.” He went on to say, “...when the Chair is called on to determine whether there is a prima facie case of privilege, it must take into consideration the extent to which a member was hampered in performing their parliamentary functions and whether the alleged facts are an offence against the dignity of Parliament.” The Chair also indicated:
When it is determined that there is a prima facie case of privilege, the usual work of the House is immediately set aside in order to debate the question of privilege and decide on the response. Given the serious consequences for proceedings, it is not enough to say that the breach of privilege or contempt may have occurred, nor to cite precedence in the matter while implying that the government is presumably in the habit of acting in this way. The allegations must be clear and convincing for the Chair.
The government understands and respects the well-established practice that members have a right of first access to the legislation. It is clear that the government has been talking about and consulting widely on its plan to introduce online harms legislation for the past two years. As I have demonstrated, the public consultations have been wide-ranging and in-depth with documents and technical papers provided. All of this occurred prior to the bill's being placed on notice.
Some of the information provided by the member for is not even in the bill, most notably the reference to its being modelled on the European Union's Digital Services Act, which is simply false, as I have clearly demonstrated. The member also hangs his argument on the use of the phrase “not authorized to speak publicly” in the media reports he cites. That phrase is certainly not proof of a leak, especially when the government consulted widely and publicly released details on the content of the legislative proposal for years before any bill was actually placed on notice.
The development of the legislation has been characterized by open, public and wide-ranging consultations with specific proposals consulted on. This is how the was able to proclaim, on February 21, before the bill was even placed on notice, that he and his party were vehemently opposed to the bill. He was able to make this statement because of the public consultation and the information that the government has shared about its plan over the last two years. I want to be clear that the government did not share the bill before it was introduced in the House, and the evidence demonstrates that there was no premature disclosure of the bill.
I would submit to the House that consulting Canadians this widely is a healthy way to produce legislation and that the evidence I have presented clearly demonstrates that there is no prima facie question of privilege. It is our view that this does not give way for the Chair to conclude that there was a breach of privilege of the House nor to give the matter precedence over all other business of the House.