Tech Giants’ Intimidation and Subversion Tactics to Evade Regulation in Canada and Globally

Introduction

Motion Passed by the Standing Committee on Canadian Heritage

On 20 March 2023, the House of Commons Standing Committee on Canadian Heritage (the Committee) adopted the following motion:

That, given the dominant market position of Meta and Google and each company’s recent actions in Canada which appear to be attempts to intimidate Parliament and which follow a pattern of repeated subversive tactics used by tech giants across the world to prevent accountability, the committee undertake a study on tech giants’ current and ongoing use of intimidation and subversion tactics to evade regulation in Canada and across the world, and that as such, the committee hold a minimum of 5 meetings; and that, as part of this study;
  • I) The committee summon Meta executives to testify following their renewed threat to leave the Canadian news market;
    • (a) That Mark Zuckerberg, Chairman of Meta Platforms Inc., Nick Clegg, President of Global Affairs and Chris Saniga, Head of Canada for Meta, be ordered to appear before the committee for no less than two hours at a publicly televised meeting;
    • (b) That Meta Platforms Inc., and its subsidiaries, be ordered to provide:
      • (i) All internal and external communications (including but not limited to emails, texts or other forms of messages), save and except direct communications with individual Canadians back and forth, related to actions it planned to take or options it considered or is considering in relation to all Canadian regulation since April 5, 2022, including that under Bill C-18, including but not limited to, restricting the sharing of news content on its platforms in Canada.
      • (ii) Any internal documents, memos or internal communications relating to the impact of the company on the Canadian journalism sector since April 5, 2022.
    That this be delivered to the committee no later than 5PM ET on March 31st, 2023.
  • II) The Committee notes that in accordance with its motion adopted on February 28th, 2023, on its study of the activities of Google in reaction to Bill C-18, the Committee received a letter on March 17, 2023 whereby Kent Walker, President of Global Affairs and Chief Legal Officer at Alphabet Inc. and Richard Gingras, Vice-President of News at Google have agreed to appear before the Committee for no less than two hours at a public televised meeting. The Committee shall incorporate that meeting into this study.
  • III) That a minimum of two meetings be allocated to hear from government officials, civil society and experts from other jurisdictions including, but not limited to, the European Union and Australia that have experienced tactics similar to those being used in Canada.
  • IV) That one meeting be allocated to the study of tech giants’ abuse of power around the world; that domestic and international antitrust and competition experts be invited to testify as to tech giants’ anticompetitive behaviors and abuse of market dominance in multiple jurisdictions, with a specific focus on harms to consumers, the news and cultural industries.[1]

The Committee held meetings to study the issue on 8 May, 28 November, 5 December, 7 December and 14 December 2023. As per paragraph (II) of the motion, the Committee incorporated the meeting with Google executives on 20 April 2023 into the study.

Context of the Study

Transformation of the News Media Industry

Media outlets in Canada and around the world have experienced years of major declines in revenue. In 2017, The Shattered Mirror: News, Democracy and Trust in the Digital Age, a landmark report published by the Public Policy Forum, surveyed the disruption in news media precipitated by the rise of digital media, confirming that advertising revenues were increasingly migrating away from news outlets and into the hands of digital companies, notably Google and Facebook.[2] The House of Commons Standing Committee on Canadian Heritage came to the same conclusion. In its report entitled Disruption: Change and Churning in Canada’s Media Landscape, released in June 2017, the Committee noted that Canada’s media landscape had “changed radically in recent years.”[3] Canadians were turning to digital platforms for their news and media content, and traditional media platforms (print, television and radio) were under financial pressure.

Bill C-18, An Act Respecting Online Communications Platforms That Make News Content Available to Persons in Canada

Bill C-18, An Act respecting online communications platforms that make news content available to persons in Canada (short title: the Online News Act)[4] was introduced in the House of Commons by former Minister of Canadian Heritage, the Honourable Pablo Rodriguez, on 5 April 2022. Its purpose was to rebalance power dynamics in the digital news marketplace in order to ensure fair compensation for Canadian media outlets and journalists. Bill C‑18 was considered by the Standing Committee on Canadian Heritage between September and December 2022[5] and by the Standing Senate Committee on Transport and Communications between April and June 2023.[6] The Act received Royal Assent on 22 June 2023.[7]

The Online News Act creates a legislative and regulatory framework under which digital news intermediaries negotiate agreements with Canadian news media authorizing the intermediaries to disseminate Canadian news content on their platforms. It sets up a process that enables smaller media outlets to bargain collectively. It also expands the mandate of the Canadian Radio-television and Telecommunications Commission (CRTC) by giving it responsibility for developing the code of conduct governing the bargaining process and by mandating it to determine whether agreements made outside the bargaining process meet the conditions for exemption.[8]

The proposed Regulations respecting the application of the Online News Act, the Duty to Notify and the Request for Exemptions were posted in the Canada Gazette on 2 September 2023 for a consultation period ending 1 October 2023.[9]

The final regulations, the Online News Act Application and Exemption Regulations, were posted in the Canada Gazette on 15 December 2023.[10]

According to the final regulations, platforms subject to the framework must earn a total global revenue of CA$1 billion or more in a calendar year; operate a search engine or social media market involving the sharing of news content in Canada; and have more than 20 million Canadian average monthly unique visitors or average monthly users. Platforms that meet the criteria must notify the CRTC within 180 days, during which time they can bargain with news businesses and seek an exemption. The final regulations also provide direction to the CRTC on interpreting the criteria in the Act to determine whether a platform qualifies for exemption.[11]
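
Taken together, these conditions form a simple three-part test. The sketch below (in Python) is a minimal illustration of how the criteria combine; the thresholds come from the final regulations as described above, while the example platform and its figures are invented, and the CRTC’s actual assessment is more involved.

    # Illustrative application test under the Online News Act final
    # regulations. Thresholds are as described above; the example
    # platform's figures are invented for illustration.
    REVENUE_THRESHOLD_CAD = 1_000_000_000  # total global revenue in a calendar year
    USER_THRESHOLD = 20_000_000            # Canadian average monthly unique visitors/users

    def subject_to_act(global_revenue_cad: float,
                       operates_news_search_or_social_market: bool,
                       canadian_monthly_users: int) -> bool:
        """All three criteria must be met for the framework to apply."""
        return (global_revenue_cad >= REVENUE_THRESHOLD_CAD
                and operates_news_search_or_social_market
                and canadian_monthly_users > USER_THRESHOLD)

    # Hypothetical platform: CA$1.2B global revenue, operates a search
    # engine that shares news in Canada, 25M Canadian monthly users.
    print(subject_to_act(1.2e9, True, 25_000_000))  # True: must notify the CRTC within 180 days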

Between 13 March and 12 April 2024, the CRTC held public consultations on its own regulations concerning the exemption process, the code of conduct, and the eligibility of news businesses, among other things.[12] The CRTC “[expects] to start publishing the decisions that set out this important regulatory framework later this year.”[13]

Google’s and Meta’s Actions in Response to Bill C-18

The responses of Google and Meta, the two digital news intermediaries that would be subject to the Online News Act, are what prompted the present study.

Notably, both Google and Meta blocked access to news in Canada, or threatened to do so, in order to avoid having to compensate publishers for news content shared on their platforms under the Online News Act.

Google’s Actions in Response to Bill C-18

Google appeared before the Committee on 18 October 2022 as part of the Committee’s study of Bill C‑18 and expressed its concerns with the legislation. The company also proposed several amendments to the bill.[14]

Colin McKay, Head, Public Policy and Government Relations for Google Canada, stated that it “sends billions of visits to Canadian news publishers a year at no cost to them, helping them grow their readership and subscriber base.”[15]

Mr. McKay said the bill “defines eligible news businesses extremely broadly and does not require a publisher to adhere to basic journalistic standards,” potentially “[leading] to the proliferation of misinformation and clickbait.”[16]

He also expressed concern about the “undue preference” provision of the bill, which he said would “restrict Google and other platforms from applying policies and providing features that elevate trusted information sources over lower-quality content.”[17]

Mr. McKay also objected to the idea of payment for links, saying such a provision “violates global copyright norms and local legal precedent” and “incentivizes cheap, low-quality clickbait content over public-interest journalism” and also “favours large publishers over small ones as they simply have more content to link to.”[18]

He said that a “fund similar to the Canada Media Fund would resolve the issues we have raised and would ensure that a diversity of Canadian news and publishers receive money in a timely, equitable and transparent manner.”[19]

On 22 February 2023, Google confirmed to The Canadian Press that it was rolling out product tests blocking access to news content for some users in Canada. The company said it was “briefly testing potential product responses to Bill C-18” and that the tests would impact “a random sampling of less than 4% of users in Canada” by “[limiting] the visibility of Canadian and international news to varying degrees.”[20]

In response, on 28 February 2023, the Committee adopted a motion to “undertake a study into the activities of Google in reaction to Bill C-18” and summon a number of senior executives to testify. The Committee also requested documentation relating to the product tests.[21]

Sabrina Geremia, Vice President and Country Manager at Google Canada, and Jason Kee, Public Policy Manager at Google Canada, appeared before the Committee on 10 March 2023. They told the Committee that Google “runs over 11,500 tests each year,” that “[no] decisions have been made about product change” and that the company “remains committed to working constructively with the government on reasonable and balanced solutions that would fix Bill C-18 and contribute to a healthy, innovative and diverse news ecosystem for the digital age.”[22]

On 29 June 2023, following the passage of the Online News Act, Kent Walker, President of Global Affairs at Google and Alphabet, said in a Google Canada blog post that the company had told the government that it would remove links to Canadian news from its Search, News and Discover products in Canada, and discontinue its Google News Showcase product in Canada, once the law took effect. The company said that the government had “not provided us with sufficient certainty that the regulatory process will be able to resolve structural issues with the legislation (such as forced payment for links and uncapped financial liability).” Mr. Walker stated that the company would participate in the regulatory process.[23]

On 29 November 2023, the Minister of Canadian Heritage announced that the government had reached an agreement with Google whereby the company would contribute $100 million annually, indexed to inflation, to news businesses across Canada.[24]

Government officials announced on 15 December 2023 that CBC/Radio-Canada would receive a maximum of 7% of the fund, while 30% would be split among other broadcasters, and the remaining 63% would be distributed to print and digital outlets.[25]
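
Applied to the announced $100 million annual contribution, those percentages work out as follows. This short calculation (in Python) is a back-of-envelope illustration only: the CBC/Radio-Canada share is a maximum rather than a guaranteed amount, and actual allocations depend on how the fund is administered.

    # Back-of-envelope split of the $100M annual contribution using the
    # percentages announced on 15 December 2023. Illustrative only: the
    # CBC/Radio-Canada figure is a cap, not a guaranteed amount.
    fund_cad = 100_000_000
    shares = {
        "CBC/Radio-Canada (maximum)": 0.07,
        "Other broadcasters": 0.30,
        "Print and digital outlets": 0.63,
    }
    for recipient, share in shares.items():
        print(f"{recipient}: ${fund_cad * share:,.0f}")
    # CBC/Radio-Canada (maximum): $7,000,000
    # Other broadcasters: $30,000,000
    # Print and digital outlets: $63,000,000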

Meta’s Actions in Response to Bill C-18

Meta detailed its concerns about Bill C-18 and warned that it would reconsider “whether we continue to allow the sharing of news content in Canada” in a blog post on 21 October 2022.[26]

Meta said in the post that it does not “unfairly [benefit] from its relationship with news publishers” and has “collaborated ... with Canadian news providers to invest in partnerships and programs that support the development of sustainable business models for news organisations.” The company also said it has “sent registered publishers more than 1.9 billion clicks in a single year” and that Bill C-18 asks it to “acquiesce to a system that lets publishers charge us for as much content as they want to supply at a price with no clear limits.”[27]

Meta said the bill “unfairly subsidises legacy media companies that have struggled to adapt to the online environment” and that this approach would “harm competition” and “make the transition to digital models even more difficult.”[28]

Meta appeared before the Committee on 28 October 2022 and reiterated its concerns.[29]

On 11 March 2023, the day after Google’s appearance before the Committee, Meta reiterated its intention to block news in Canada in response to Bill C‑18, an announcement that in part prompted the Committee’s motion to undertake the present study.[30]

Meta appeared before the Committee again on 8 May 2023 as part of the present study and confirmed its intention to “end the availability of news content on Facebook and Instagram in Canada.”[31]

On 1 June 2023, Meta announced it would “begin tests on both platforms that will limit some users and publishers from viewing or sharing some news content in Canada.” The tests would “run for several weeks” and affect “a small percentage of people in Canada,” according to the company’s statement. Meta framed the plan to end news availability as a business decision and said it would “end the availability of news content in Canada permanently following the passage of Bill C‑18.”[32]

In an update on 1 August 2023, Meta announced it had “begun the process of ending news availability in Canada.” The company said it was “identifying news outlets based on legislative definitions and guidance from the Online News Act.”[33]

The company reiterated that the move was a “business decision”[34] and that the ongoing regulatory process was “not equipped to make changes to the fundamental features of the legislation that have always been unworkable”[35] for Meta.

The Department of Canadian Heritage reportedly approached Meta in December 2023 to resume negotiations,[36] but based on information available to the Committee, at the time of writing, no agreement has been reached with the company, and Meta continues to block access to news content in Canada.

What the Committee Heard

Google’s Testimony

On 28 February 2023, in response to news that Google was running tests affecting the availability of news on its platforms for some Canadian users, the Committee adopted the following motion:

That pursuant to Standing Order 108(2), the committee undertake a study into the activities of Google in reaction to Bill C-18 including but not limited to the decision by Google to test the blocking of news sites in Canada;
That pursuant to Standing Order 108 (1)(a) the committee summon Sundar Pichai, Chief Executive Officer of Alphabet Inc., Kent Walker, President of Global Affairs and Chief Legal Officer at Alphabet Inc., Richard Gingras, Vice-President of News at Google and Sabrina Geremia, VP and Country Manager for Google in Canada, to testify for a two hour meeting on Monday, March 6, 2023;
That the committee order Alphabet Inc. and all of its subsidiaries including Google to provide:
  • a) any and all internal or external communications (including but not limited to emails, texts or other forms of messages) related to actions it planned to take or options it considered in relation to Canada’s Bill C-18, including but not limited to those in relation to the testing of the blocking of news sites in Canada;
  • b) the list of all news organizations blocked by Google, in Canada;
That this be delivered to the committee no later than 5PM EST on Thursday, March 2, 2023.[37]

According to the motion adopted by the Committee on 20 March 2023:

(II) The Committee notes that in accordance with its motion adopted on February 28th, 2023, on its study of the activities of Google in reaction to Bill C‑18, the Committee received a letter on March 17, 2023 whereby Kent Walker, President of Global Affairs and Chief Legal Officer at Alphabet Inc. and Richard Gingras, Vice-President of News at Google have agreed to appear before the Committee for no less than two hours at a public televised meeting. The Committee shall incorporate that meeting into this study.[38]

Kent Walker, President of Global Affairs at Google LLC, and Richard Gingras, Vice‑President of News at Google LLC, appeared before the Committee on 20 April 2023.

Google reiterated its concerns about Bill C-18 and told the Committee it would continue to oppose the legislation.[39] Mr. Walker said there is a “better model”[40] for supporting journalism than the Online News Act.

Mr. Walker said that news “doesn’t have that much economic value” to Google and that news queries “tend to be the least monetizable.”[41] Mr. Gingras described Google News as “a newsstand that publishers don’t pay to be on” that “delivers zero revenue” to the company. He said it would be “reasonable ... for any business to reconsider” the practice of linking to news content if forced to pay for such links.[42]

Mr. Gingras told the Committee that Google carries out “thousands of tests”[43] as a matter of business. He said the news blocking tests were designed to understand the impact of Bill C-18 and the company’s options in relation to it.[44]

The company said the testing had been “random”[45] and had not targeted any individuals or organizations in Canada.

Kent Walker characterized the tests as relating to a potential business decision should the company be made to pay for links, saying “when there’s a tariff or a fee for a good or service, businesses will naturally look to see whether they should provide as much of that good or service.”[46]

Mr. Gingras said the tests confirmed that “news queries are very small percentages to Google—less than 2%” and that the tests had “no impact” on users “with regard to non‑news inquiries.”[47]

Mr. Walker also said that Google was “one of the world’s biggest supporters of journalism”[48] and that Google is not “resisting participating”[49] in change that would ensure the continuity of the news industry.

Responding to questions about whether the company had engaged in “astroturfing”—a practice whereby corporations, instead of lobbying legislators and policymakers directly, enlist ostensibly independent third parties to promote their interests in the guise of “grassroots” support[50]—Mr. Walker said the company had made “efforts to allow a variety of stakeholders, who had their own concerns about the legislation, to have a seat at the table.”[51] He said Google does “not make payments for parties to astroturf” or “to YouTube creators to lobby on our behalf.”[52]

On Google’s dominance of the ad tech market, Mr. Walker said Google has offered a “significant opportunity for publishers around the world to find ways to do a better job of monetizing their digital content” and that the company’s services “would give them the large majority of the revenues that are coming from advertising.” He said that Google’s ad services “[create] more available revenue and advertising revenue to keep for publishers as they manage to transition into the digital age.”[53]

Meta’s Testimony

Kevin Chan, Global Policy Director for Meta Platforms, and Rachel Curran, Head of Public Policy at Facebook Canada, appeared before the Committee on 8 May 2023.

Sir Nicholas Clegg, President of Global Affairs at Meta, had previously accepted an invitation to appear at the hearing. However, Mr. Chan told the Committee that, because the title of the study had been changed to more accurately reflect the content of the motion, Sir Nicholas had elected not to appear. Mr. Chan said the new title was “much more confrontational” and that the company had been “looking forward to a substantive discussion about Bill C-18.”[54]

Mr. Chan read Sir Nicholas’s opening statement, which reiterated Meta’s concerns about Bill C-18 and characterized the company’s decision to end news sharing as a “business decision.”[55]

Asked about serious problems[56] that had occurred when the company had stopped sharing news content in Australia, Ms. Curran said Meta was “preparing very carefully” to ensure that the same errors would not be made in Canada. She said the company is being “fully transparent”[57] with Canadians and parliamentarians about the process.

Ms. Curran explained to the Committee that news remained accessible on Meta’s platforms in Australia because the company is “not designated under the Australian legislation.” She said the Australian law “allowed time for a process to unfold whereby we could reach what we call an untidy and short-term compromise for news to remain on our platforms”[58] but that the Canadian legislation “doesn’t ... allow for any kinds of discussions like that, or for a process to unfold, before we are designated and subject to the framework contained in Bill C‑18.”[59]

Ms. Curran also reiterated Meta’s concerns about Bill C-18 to the Committee, saying that while “news has a real social value,” it does not have “much of an economic” value and that the company is “being asked to compensate news publishers for material that has no economic value to us.”[60]

Mr. Chan said that Meta had proposed a number of amendments to the Committee as well as the Standing Senate Committee on Transport and Communications during their study of the bill, and that the company had had “some” meetings with government representatives but that these had been unproductive.[61]

On the question of astroturfing, Mr. Chan said that Meta was not funding third party organizations to lobby on its behalf in Canada.[62] He said annual reports released in Canada contain details on “companies we have supported.”[63]

Mr. Chan also affirmed Meta’s support for news businesses in general, noting that the company had entered into 18 agreements with news organizations, including small publishers.[64] He also said it had spent $8 million in Canada “with respect to programs with news publishers and partnerships”[65] as part of a global journalism project. Ms. Curran said a “central fund model” for news organizations would be “easier ... for [the company] to support” than the framework established by Bill C-18.[66]

Ms. Curran said that a “cross-functional team” was “working to understand the legislation and to prepare for the removal of news content on our platforms”[67] and that the team had not been asked to sign any non-disclosure agreements.[68]

On the question of online safety, Mr. Chan said the company has “very strict content policies that go well above the rule of law” and that cover “harmful content,” “terrorist content,” and “violent extremist content.”[69] Ms. Curran said the company’s “enforcement systems aren’t perfect, but they’re getting better every year, and we report on those results transparently and publicly ... so that Canadians and parliamentarians know that we’re holding ourselves to a certain standard.” She said, “we have large teams working around the world to remove content that’s forbidden by our community standards.”[70]

Tactics of the Tech Giants

Many witnesses, including academics and representatives from civil society, told the Committee about tactics used by corporations such as Google and Meta in response to regulation in other jurisdictions as well as in Canada.

They described a number of strategies, including lobbying (both directly and through “astroturfing”), funding friendly research, suppressing independent researchers, intimidating legislators and applying other types of pressure.

Georg Riekeles, Associate Director at the European Policy Centre, said the tech industry was using the same strategies as the tobacco industry before it:

It’s about lobbying. It’s about framing the narrative. It’s about creating alliances and setting up front groups and astroturfing campaigns. It’s about influencing or buying think tanks and academics. It’s about hospitality. It’s about political support and funding. It’s using philanthropy. It’s also about litigation and intimidation, and about the use of international pressure.[71]

Bram Vranken, a researcher at the Corporate Europe Observatory, said the aim of the tech companies is to “make sure that there are as few hard regulations as possible to preserve the profit margins and business model” and to “water down” any new rules that cannot be blocked.[72]

Imran Ahmed, Chief Executive Officer of the Centre for Countering Digital Hate, described Meta’s actions as “a temper tantrum by a company that has shown itself, at every opportunity, to be completely opposed to governance by democratically elected governments worldwide.”[73]

Dr. Courtney Radsch, Director of the Center for Journalism and Liberty at the Open Markets Institute, told the Committee that companies like Facebook “undermine democratic institutions, seek to handicap regulatory agencies and evade laws they don’t like.”[74]

Dr. Erik Peinert, Research Manager at the American Economic Liberties Project, explained to the Committee that the companies “see oversight and market governance as an existential threat to their predatory business models, and they react with hostility” to regulatory proposals, “with bullying, threats, and coercion.”[75]

Lobbying and Astroturfing

Several witnesses shared with the Committee their concerns about the lobbying power of the tech giants.

Concerns about the tech sector’s lobbying power are not new. In 2021, Reuters reported a 27% jump in U.S. lobbying activity by Google, in response to a “long list of bills ... aimed at reining [the tech giants] in.”[76]

In the first half of 2020, Google, Facebook, Amazon, Apple and Microsoft spent a combined 19 million euros on lobbying activities in Europe as the European Union (EU) scaled up its efforts to regulate the sector, prompting fears among observers of a “Washingtonization of Brussels” that would “[give] money and connections an upper hand over the public interest.”[77]

Also in 2020, a leaked Google document, “DSA 60-day plan update,” revealed an aggressive strategy to undermine progress toward the Digital Services Act.[78]

Georg Riekeles described the companies’ “direct and hidden lobbying” as being “of a brazenness and scale that … are totally out of line with the applicable codes of conduct for interest representation and the most basic behavioural principles in society.”[79]

Jason Kint, Chief Executive Officer at Digital Content Next, said that Google and Facebook “registered in the top 10 lobbyists in the EU and the U.S.,” and that “in addition to direct employee and campaign contributions, there is a long list of groups that champion the two companies’ talking points in return for significant funding.”[80]

Bram Vranken said that in the EU, the “top 10 digital corporations alone spend a total of 40 million euros a year on lobbying” and that Facebook alone spends 8 million euros, an “increase of a factor of 17” over the 450,000 euros he said the company spent a decade ago.[81]

Mr. Vranken told the Committee that the tech giants use their “massive funding” to “build very extensive networks of lobby groups and lobby consultancies, and provide funding to think tanks and universities,” creating a “gigantic lobbying echo chamber that constantly plays a variation of the same tune: Regulation will damage the economy, damage innovation and be bad for small and medium enterprises.” He referred to the leaked 2020 Google lobbying strategy document, highlighting Google’s “approach, which was, first of all, to mobilize third parties such as think tanks and academics to echo Google’s messages, and second, to reframe the political narrative around costs to the economy and consumers.”[82]

Mr. Vranken described the “insidious” lobbying strategy of “[funding] organizations claiming to represent small and medium enterprises (SMEs), startups, and software developers.” He said that “[in] one case, Apple provided more than half the funding for an organization claiming to represent app developers.” He cited another example in which “many of the member companies of a big-tech funded SME trade association did not know they were a member” and “did not agree with [the trade association’s] position.”[83]

Mr. Riekeles further described the companies’ use of “front groups and alliances” in the EU:

One example under the copyright debates deals with one of the most vocal stakeholder coalitions in Brussels, called C4C, the Coalition for Creativity, which represented all from public libraries to digital rights organizations. It turned out ex post that this coalition was financed by the Computer & Communications Industry Association, that is, financed indirectly by Google and other platforms. The coordinator was, by chance, also a consultant for Google.[84]

Mr. Riekeles told the Committee that tech regulations “understood in terms of enforcing a strict competition regime or rules to keep privacy invading platforms in check” were insufficient. He said that regulations needed to encompass the tech sector’s “capacity to influence private institutions, civil society and policy discourse.”[85]

Dr. Courtney Radsch told the Committee that “big tech companies spend more money in Washington, Brussels and other world capitals than virtually any other sector,” both “through direct lobbying” and “by funding industry groups and fellowships that help shape how policy-makers think about issues they regulate.” She said the tech sector “provides funding to most civil society, research and advocacy groups working in tech policy, digital rights, AI governance and the media bargaining code space, as well as journalism.”[86]

By contrast, Dr. Michael Geist, Canada Research Chair in Internet and E-Commerce Law at the University of Ottawa, said that in respect of Bill C-11 and Bill C-18 it was News Media Canada that had the “most registered lobbyist meetings” with the government, rather than Google or Meta.[87]

Blocking Independent Research

Some witnesses described the difficulties faced by researchers working to cast light on the platforms’ operations.

Dr. Joan Donovan, Online Disinformation and Misinformation Expert at Boston University College of Communication, told the Committee how a Facebook donation of “half a billion dollars” to the Harvard Kennedy School had “killed” the Technology and Social Change (TaSC) research project that she had headed. She said that in “deference to donor interests,” the dean of Harvard Kennedy School had terminated the TaSC project after “a well-known Facebook fixer became enraged in a donor meeting” over Dr. Donovan’s plan to create a “public collaborative archive” out of whistleblower Frances Haugen’s internal Facebook documents.[88]

Dr. Donovan said, “when a school like Harvard is complicit in the corporate direction of research, what can protect those of us who work to document, analyze and share the truth?”[89]

Jason Kint told the Committee about a group of researchers at New York University who were “blocked by Facebook” for “[trying] to expose some of the harms on the platform.”[90]

Another difficulty facing researchers is a lack of access to platform data needed to carry out research effectively, according to some witnesses.

Dr. Radsch said it is “difficult to even gain access to the data needed to do much of the research” and that, as such, “the platforms have a very dangerous hold on our ability to understand our information ecosystem, how information and communication circulate online, and of course how harassment plays out as well.” She said that the access problem will be worse in the case of artificial intelligence “whereby access to the massive data models and the computational power needed to do this research mean that it’s often only researchers who have some sort of link with a major tech company who are able to conduct this research or gain access to it.”[91]

Matthew Hatfield, Executive Director of OpenMedia, told the Committee that “the limited research that exists on how platform models may sometimes amplify harms is done with very incomplete data or with crumbs of researcher data access, which platforms are quick to withdraw if their interests are threatened.” He recommended that provisions for “academic researcher access” be included in Canada’s forthcoming online harms legislation.[92]

Nora Benavidez, Senior Counsel and Director of Digital Justice and Civil Rights at Free Press, told the Committee that one of the platforms’ tactics is “cutting off researcher and [application programming interface (API)] access to platform data.” She cited the example of the New York University Ad Observatory, which “was denied access by Facebook in 2021 ... following months of inquiry analyzing its ad library tools.” She said Twitter had placed a “high price tag” on researcher access to its API tool and said, “all of the major platforms require advance notice from researchers, who must be affiliated with universities to get access to their API. This sets up a de facto process whereby the platforms can approve or reject research access if they don’t like how the ultimate product will be used.”[93]

Funding Friendly Research

Large tech companies have been known to provide funding for research that is favourable to them. On 6 December 2023, the Tech Transparency Project, a nonprofit which describes itself as “an information and research hub for journalists, academics, policymakers and members of the public interested in exploring the influence of the major technology platforms on politics, policy, and our lives,”[94] revealed that “Mark Zuckerberg’s personal philanthropy and his company, Meta, have collectively donated hundreds of millions of dollars to more than 100 U.S. colleges and universities across the country, giving the CEO powerful potential leverage to influence the institutions.”[95]

A 2021 paper by researchers at Harvard University and the University of Toronto found that 52% of tenure-track computer science professors in top-tier schools “with known funding sources … have been directly funded by Big Tech,” including 58% of computer science faculty working in AI and in ethics.[96]

Dr. Joan Donovan told the Committee that Facebook “is providing contracts to researchers, not just at universities but also in civil society” in “an attempt to make academia and research into a wing of their own PR (public relations).”[97] She added that the contracts contain “kill clauses or veto clauses that say Facebook has the right to read your research prior to publication and to decide if they think it has met their privacy standards,” with “privacy” referring not only to that of the platform’s users but to the “corporate products” as well. Dr. Donovan said, “[i]f you’re a researcher and you want to study the algorithmic impact of Facebook’s products, you have to be very careful that you’re not also sharing what Facebook would consider trade secrets, or they could shut your research down if they were funding you.”[98]

Dr. Donovan also told the Committee that Facebook “has executives who have taken up positions on advisory boards at universities across the U.S. and Canada,” and who “use that soft power and influence to direct research agendas.”[99]

By contrast, Dr. Geist said that while the Tech Transparency Project had previously identified “many papers and work by academics with links to, or financial backing from” Google, there were “virtually no Canadian examples.”[100]

Use of Platforms to Control Media Coverage

Some witnesses noted that the tech giants have an added advantage in the fight against regulation because the platforms they own are also a vehicle for them to promote their own positions.

Dr. Courtney Radsch said the tech giants’ “manipulation” was “intensified through the use of their own platforms to manipulate public opinion and to censor their critics”:

[Tech] giants use their platforms to propagandize against regulation they oppose, distorting public perception and debate. We saw this in Australia, Canada, Brazil and the U.S. with news media bargaining legislation. Google used its search page to advocate against the proposed laws, and reportedly told evangelical preachers in Brazil that they would no longer be able to quote the Bible online. The Brazilian judiciary accused Google of undue influence in the legislative process.[101]

Mr. Riekeles said the tech companies use their power “directly” to gain influence and cited an example in the context of the EU’s copyright reform:

When the EU was trying to regulate user-generated content and confer ancillary copyrights on press publishers in 2018 and 19, big tech was directly corralling protestors to the barricades … YouTube’s chief executive, Susan Wojcicki, crassly told YouTube creators in a letter that the legislation posed a threat to both their livelihoods and their ability to share their voices, threatening hundreds of thousands of jobs and threatening the freedom of expression and the web as we know it.[102]

Mr. Riekeles noted that “the copyright directive took effect across Europe two years ago” and suggested that Google’s “dramatic warning that it would change the web as we know it” had not been substantiated.[103]

Mr. Kint said that “companies intimidate consumers in order to drive outrage, including by using their dominant gateways of YouTube, search and messaging” to “[claim] that regulations will destroy innovation or end the free and open internet.” He said, “Facebook often takes it a step further by suggesting it will have to charge for services or kill thousands of small businesses and millions of jobs.”[104]

Intimidation Tactics Aimed at Legislators

Mr. Kint cited Facebook’s blocking of news in Australia as a direct “threat to legislation,” timed to occur “during the most critical week of Parliament’s deliberations” on its News Media Bargaining Code.[105]

Mr. Kint also told the Committee that the big tech companies “[threaten] investments” as a way of intimidating lawmakers. He said an open records request in the U.K. had revealed “that Mark Zuckerberg threatened to pull back investment in the U.K. at a time when its Parliament was demanding he testify about questions” relating to the Cambridge Analytica scandal.[106]

By contrast, Philip Palmer, President of the Internet Society Canada Chapter, told the Committee that Meta’s withdrawal from the Canadian news market was not “intimidation” but a “lawful and rational business decision.”[107]

Bill C-18, the Online News Act

A number of witnesses shared their views about the legislation that had prompted Google and Meta to consider ending news availability in Canada, as well as the $100 million fund negotiated with Google as a result of the legislation.

Sean Speer, Editor-at-Large at The Hub, said, “a model that doesn’t follow consumer signals or market signals but instead has either the government or … an industry association or … interlocutor between the individual media organizations and, in this particular case, Google,” means that “resources will be disproportionately directed to legacy media companies and not the parts of the sector that are growing and innovating.” He said this consequence was “inherent” in the “policy framework that has been established” by Bill C‑18.[108]

Dr. Geist also felt that the Online News Act, by “[going outside] of the [Qualified Canadian journalism organization] framework that we have around the labour journalism tax credit, ensured that broadcasters, which are dominated by a handful of large players in Canada, would be the major beneficiaries.”[109]

Some witnesses agreed with Google’s and Meta’s assertion that news content did not have economic value for the platforms. Matthew Hatfield told the Committee that Bill C-18 had wrongly assumed that “news has inherent value to platforms” whereas, “for Meta at least, it does not.”[110] Likewise, Dr. Geist said, “the idea that news content was something [the platforms] couldn’t live without never made much sense.”[111]

Several witnesses told the Committee that news outlets benefited from exposure on search and social media platforms. Jeff Elgie, Chief Executive Officer at Village Media Inc., said Village Media “willingly [pays] to allow for snippets of our content to appear on the platforms” and “[benefits] tremendously from” the resulting traffic. He said that such traffic had helped Village Media “grow and launch 25 publications and develop a profitable and sustainable model for local news.”[112]

Some witnesses disagreed that Google and Meta did not profit from news. Dr. Erik Peinert told the Committee that Google and Meta are “using their dual control over Internet traffic and advertising to monetize content that journalists produce at considerable expense.” He cited research from the University of Zurich “[indicating] that 40% of Google’s total revenue from search advertising would go to publishers and other journalism outlets if it faced more competition.”[113] Dr. Radsch cited an American study that estimated the platforms owed U.S. publishers $12 billion a year.[114]

A number of witnesses directly blamed Bill C-18 for Meta’s exit from the news market, which they said had caused significant damage to news organizations, including digital innovators. Mr. Elgie and Mr. Palmer both said the priority should be to “get Meta back.”[115]

Mr. Palmer characterized Meta’s ending of news availability in Canada as “a lawful and rational business decision” that “has proven to be a hardship for Canadian news producers.”[116]

Mr. Elgie said that Village Media had been profitable “over our 10 years of operation,” but that “as of April of [2023], in anticipation of the outcome of the Online News Act, and for the first time ever, our company has paused almost all new hirings and suspended new community launch plans.”[117]

Sean Speer told the Committee that Meta’s withdrawal from news as a consequence of Bill C-18 has cost many news organizations, including his own, “the ability to communicate, reach our current audience and grow it.”[118]

Dr. Geist said that both the Online News Act and the Online Streaming Act, as well as the Committee’s approach to those laws, had had “significant negative implications for access to foreign content for diaspora communities.” He said, “the increased cost of regulation and registration ... could well lead many foreign streaming services to simply block the Canadian market” and that diasporic communities “may be most directly affected.”[119]

By contrast, Jason Kint of Digital Content Next said his company “enthusiastically supported” Bill C-18.[120] He pointed out that “news was struggling before Bill C‑18 was passed [and] before Facebook pulled out” and that “traffic is down from Facebook and Meta across the board internationally.” He said Canada had “bravely passed legislation by looking at smart legislation elsewhere that is working” and that Meta’s actions could be characterized as a “temper tantrum.”[121]

Witnesses also shared their views and concerns about the $100 million fund negotiated between Google and the Government of Canada.

Mr. Elgie said the “ultimate value of the Google deal may in fact be less” than the value of prior agreements between publishers and the two tech platforms. He said many small publishers would likely “prefer to have their Meta traffic back” than to receive an as yet undetermined amount from the Google fund, and that “even the best-case scenario for the Google deal likely does not make up for the value of lost Meta traffic” for Village Media.[122]

Peter Menzies told the Committee that as a result of Bill C-18, “we now unfortunately have a news ecosystem in which most of our journalists could soon have at least half of their pay dependent on the government, Google, and any other offshore money the CRTC might come up with as a result of hearings [on the Online Streaming Act].” He said this would create a perception that “news organizations are fatally compromised” by their funding sources. “As a result,” he said, “the public’s faith in journalists will continue to wither, and trust in journalism will eventually die.”[123]

Mr. Hatfield expressed his concern that “under Bill C-18 ... funding is going primarily to news organizations that, to some degree, are already succeeding and still exist” rather than to areas where it is more urgently needed.[124]

Dr. Peinert said the $100 million deal negotiated with Google “simply confirms” and “acknowledges the value the platforms gain from journalism.”[125]

Dr. Courtney Radsch said that the Google fund of $100 million was not “at all on par with what is actually owed” and pointed to a “myopic focus on the value of referral traffic” as part of the problem. She said the “tech companies have been very successful in arguing that we should have this very narrow conception of how we establish value.”[126]

Dr. Radsch said that Bill C-18 also fails to “account for generative AI and the role that news plays in large language models and AI systems.” She recommended “looking at a wider array of tech companies that could be covered and required to contribute to the fund.”[127]

Several witnesses said they felt that CBC/Radio-Canada should not be included in the Google fund, or that it should not receive an amount proportionate to its size. Mr. Menzies said the Corporation should not be a “recipient of the Google fund,”[128] Mr. Palmer said Internet Society Canada Chapter does “not favour the Google funds going to CBC at this time,”[129] and Mr. Elgie of Village Media said, “we feel this money is best directed to the private sector.”[130]

Some witnesses felt the study of the legislation had not been adequate. Dr. Geist told the Committee that its study of Bill C-18 had not engaged a “wide range of people—both supportive and critical of the legislation.”[131] He said that “too often committee is set up more as consultation theatre than as actual, real, engaged consultation” and that “the notion of making changes … is somehow seen as an admission of some sort of failure.”[132]

Section 19 of the Income Tax Act

At the request of certain Committee members, several witnesses gave their opinions on the various support measures for Canadian media.

One measure that was mentioned was section 19 of the Income Tax Act, which includes a number of restrictions limiting the tax deductibility of certain expenses related to advertising to the Canadian market. Under this section, advertising expenses for advertisements in foreign newspapers or periodicals or in foreign electronic media are generally not tax deductible when the advertisements are primarily directed to a market in Canada. However, “no such restrictions exist for advertising that Canadian businesses purchase from foreign websites.”[133] Those advertising expenses are fully tax deductible. Note that no financial data on the costs of this tax measure are available in Finance Canada’s most recent report on tax expenditures (2023).[134]
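
The practical effect of deductibility can be seen in a simple after-tax comparison, sketched below in Python. This is a hypothetical illustration only: the 26.5% combined corporate tax rate is an assumed figure, and actual outcomes depend on a business’s particular tax position.

    # Hypothetical after-tax cost of a $100,000 ad buy, with and without
    # a deduction. The 26.5% combined corporate rate is an assumption
    # used purely for illustration.
    ad_spend = 100_000
    tax_rate = 0.265  # assumed combined federal-provincial corporate rate

    deductible_cost = ad_spend * (1 - tax_rate)  # expense reduces taxable income
    non_deductible_cost = ad_spend               # no offsetting tax saving

    print(f"Deductible (e.g., ads bought on foreign websites): ${deductible_cost:,.0f}")
    print(f"Non-deductible (e.g., foreign print ads aimed at Canada): ${non_deductible_cost:,.0f}")
    # Deductible: $73,500; non-deductible: $100,000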

Some witnesses suggested eliminating the tax deductibility of advertising costs paid to foreign companies such as Facebook and Google. Erik Peinert of the American Economic Liberties Project,[135] Pierre Trudel, law professor at the Public Law Research Center of the Université de Montréal,[136] and Jean-Hugues Roy of the Université du Québec à Montréal[137] were in favour of eliminating subsidies to digital news intermediaries. Marc Hollin of Unifor was also opposed to providing “financial incentives”[138] to these companies. Courtney Radsch of the Open Markets Institute stated that she was shocked “to hear that the Canadian government is subsidizing the wealthiest companies in the world.”[139]

Dr. Geist, however, disagreed that section 19 of the Income Tax Act constituted a subsidy. He said: “It’s a deduction for businesses that advertise. The idea that we would eliminate the ability for those businesses to effectively advertise in places makes them less competitive, it seems to me.”[140]

National Public Broadcaster and Digital News Intermediaries

During the study, the role of CBC/Radio-Canada in a changing media landscape was raised.

Several witnesses indicated that it was a good idea for Canada to have a national public broadcaster. Pierre Trudel of the Université de Montréal stated that public broadcasting is essential for providing minorities, including Indigenous peoples, with access to news. However, Mr. Trudel believes that there is a need to “reinvent [the] public service.”[141] In addition, Philip Palmer said that an examination of “the role of the CBC in news” is in order.[142]

Peter Menzies also supported the idea of having a national public broadcaster, particularly to serve remote regions and prevent them from becoming “news deserts.”[143] However, Mr. Menzies believes that CBC/Radio-Canada is currently “a publicly funded commercial broadcaster and online platform operator.”[144] In his opinion, CBC/Radio-Canada should “[g]et … out of the advertising business”[145] because it is distorting Canada’s news marketplace:

There will be no flourishing for news organizations until the CBC’s dualistic distortion of the marketplace is replaced with a level playing field. We will never have one of those, provided the CBC continues to compete for advertising revenue while being paid $1.3 billion a year by Parliament to be a public broadcaster.[146]

Jeff Elgie of Village Media Inc. also believes that CBC/Radio-Canada “competes with the private sector, for digital advertising in particular.”[147]

Online Safety

Social media platforms have facilitated communication and connectivity on a global scale. At the same time, they have enabled the spread of a range of harmful content that can pose significant threats to individual well-being, to public safety and even to the integrity of democratic institutions. Attempts to address online harms through legislation have gained traction in recent years, with the United Kingdom, the EU, and Australia, among other jurisdictions, enacting or advancing legislative frameworks to impose certain legal responsibilities upon online service platforms.

The U.K.’s Online Safety Act received Royal Assent on 26 October 2023.[148] It creates a new framework and series of regulations to address illegal, harmful, and unsafe online content, with the Office of Communications (OFCOM) as the regulator. It covers a wide range of content that may cause harm, including terrorism, racism, child sexual exploitation, suicide, eating disorders, misogyny and revenge pornography. The Act requires illegal content to be removed; places a legal responsibility on social media platforms to enforce their terms of service; and requires platforms to offer users options for filtering out content they do not wish to see.

The EU’s Digital Services Act (DSA) entered into force on 16 November 2022 and as of 17 February 2024 became fully applicable across the EU.[149] It imposes a series of obligations on providers including, among others, measures to counter illegal goods, services or content online; effective safeguards for users, including opportunities to challenge content moderation decisions; a ban on certain types of targeted advertising; and transparency measures on a variety of issues, including algorithms.

Canada’s own online safety legislation, Bill C-63, An Act to enact the Online Harms Act, to amend the Criminal Code, the Canadian Human Rights Act and An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service and to make consequential and related amendments to other Acts (The Online Harms Act), was tabled in the House of Commons on 26 February 2024.[150]

According to the government, Bill C-63 “would create stronger online protection for children and better safeguard everyone in Canada from online hate and other types of harmful content.” It would “hold online platforms … accountable for the design choices made that lead to the dissemination and amplification of harmful content on their platforms and ensure that platforms are employing mitigation strategies that reduce a user’s exposure to harmful content.”[151]

The Bill would create a new legislative and regulatory framework mandating platforms to reduce the risk of harm for seven types of content:

  • a) Content that “sexually victimizes a child or revictimizes a survivor”;
  • b) Non-consensual communication of intimate content;
  • c) Violent extremist and terrorist content;
  • d) Content inciting violence;
  • e) Content fomenting hatred;
  • f) Bullying content directed at a child;
  • g) Content inducing a child to harm themselves.[152]

It also creates requirements for removing “content (1) that sexually victimizes a child or revictimizes a survivor, and (2) is intimate content posted without consent”; for providing accessible tools for flagging content and blocking users; and for implementing measures to protect children and “reduce exposure” to harmful content for everyone.[153]

The Bill also amends the Criminal Code and the Canadian Human Rights Act to address online hate and enhances the Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service.[154]

Finally, it establishes a Digital Safety Commission to oversee and enforce the Act as well as a Digital Safety Ombudsperson to “act as a resource and advocate for the public interest with respect to online safety.”[155]

Although the context of the present study was shaped by the Online News Act and Meta’s decision to stop sharing news links in Canada, some witnesses told the Committee about online harms and described similar tactics in the context of efforts to develop safety legislation such as the European Digital Services Act and the United Kingdom’s Online Safety Act.

Algorithms

Many observers have said that the very business model of online platforms has created an environment conducive to the spread of certain types of online harms, such as disinformation and conspiracy theories as well as self-harm, suicide and eating disorder content. Algorithms designed to keep users’ attention use machine-learning models to predict engagement and recommend content, and research has shown that they tend to amplify harmful content at higher rates than neutral content.[156] As such, a number of witnesses shared their concerns about the platforms’ use of algorithms with the Committee.
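
As a rough illustration of the mechanism witnesses described, the sketch below (in Python) ranks a feed purely by a model’s predicted engagement. It is a toy example, not any platform’s actual system; the post names and scores are invented.

    # Toy engagement-ranking loop: a minimal illustration of ranking by
    # predicted engagement, not any platform's actual system. Each post
    # carries an invented model-predicted probability of engagement
    # (clicks, shares, comments); the feed is simply sorted by it.
    posts = [
        {"id": "measured-news-report", "predicted_engagement": 0.04},
        {"id": "outrage-bait-thread",  "predicted_engagement": 0.21},
        {"id": "conspiracy-video",     "predicted_engagement": 0.17},
        {"id": "local-event-notice",   "predicted_engagement": 0.02},
    ]

    feed = sorted(posts, key=lambda p: p["predicted_engagement"], reverse=True)
    for post in feed:
        print(post["id"], post["predicted_engagement"])
    # Because provocative content tends to draw more engagement, an
    # optimizer like this surfaces it first unless other signals
    # (e.g., quality or accuracy) are weighted in.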

Matthew Hatfield of OpenMedia identified “the engagement algorithms that drive [the platforms’] business model” as one of three main issues concerning the influence of tech platforms on society. He said, “without even noticing it, we’ve become a society in which most information we get is delivered because it keeps us scrolling and clicking, not because it is nuanced, well researched or true” and that in the area of news reporting, the process “is making us a less-informed, angrier and more polarized society.” He said it was “critical to get more transparency into how algorithms are working.”[157]

Dr. Donovan described the functioning of the algorithm in relation to harms to children: “Once you start looking at self-harm content and learn the keywords and the tricks of the trade, you can get into that world and the algorithm will continue to send you more of that content.”[158] She referred to research from the Massachusetts Institute of Technology showing that “novel and outrageous content moves further and faster online ... because of the way algorithms mediate our experience with the information we’re seeking.”[159]

Julie Kotsis of Unifor said that the platforms hide their “unique and ever-changing algorithms ... from users and regulators” and “protect [it] at all costs.”[160]

Imran Ahmed, from the Centre for Countering Digital Hate, said his organization had shown “a strong relationship between algorithms and the promotion of conspiracist and hateful content and disinformation.”[161] He agreed that algorithms are “fundamental to technology platforms’ business models” and that information about them is “hard to obtain” because of “commercial sensitivity.”[162]

Jean-Hugues Roy said that Canada “should give itself the right to access these companies’ databases and examine their algorithms” and that “the well-being of Canadians supersedes” the companies’ commercial interests.[163]

Mr. Ahmed said that algorithmic transparency would be a key ingredient in any “comprehensive” regulatory framework governing online platforms, calling transparency “the absolute bedrock of an effective mechanism for accountability.”[164] He noted that the EU was setting up the European Centre for Algorithmic Transparency in Seville to contribute to supervision and enforcement under the Digital Services Act.[165]

Disinformation and Conspiracy Theories

Witnesses described some of the major societal harms arising from unregulated social media platforms relying on algorithms to amplify content, among them disinformation and conspiracy theories.

According to UNESCO, misinformation refers to information that is false but not created with the intention of causing harm (the person who posts the information believes the falsehood to be true, for example); disinformation, on the other hand, is information that is deliberately false and created for the specific purpose of harming a person, social group, organization, or country.[166]

The Government of Canada notes that some individuals and groups create disinformation to promote political ideologies, including extremist views and conspiracy theories, or simply to make money (e.g., from ad views).[167] Disinformation creates “doubt and confusion” and can be particularly harmful when it involves health information; it can also cause financial harms as well as political polarization and distrust in key institutions.[168]

The prevalence of disinformation can be difficult to determine, but according to Statistics Canada, in 2022, 27% of Canadian users reported seeing “information suspected to be false or inaccurate” on a daily basis;[169] further, the Survey Series on People and their Communities found that 59% of Canadians were “very or extremely concerned” about misinformation and that 43% “felt it was getting harder to decipher online truth from fiction compared with three years earlier.”[170]

Imran Ahmed told the Committee that Instagram’s decision in 2020 to add algorithmically driven, unsolicited content to users’ feeds had fuelled an increase in harmful content, including disinformation: “Once the user exhausted the latest content from all accounts they follow, they gave new content as an extension of their feed, identifying users’ potential interests based on their data and habits. If they were looking at COVID-19 disinformation, it actually gave them QAnon and anti-Semitic disinformation. If they were looking at anti-Semitic users, they were being fed anti-vax and COVID-19 disinformation as well.”[171]
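
Mechanically, the behaviour Mr. Ahmed described amounts to extending an exhausted feed with recommendations drawn from topic clusters the system treats as adjacent to a user’s inferred interests. The sketch below is a hypothetical rendering of that logic; the adjacency map, function names and topic labels are illustrative assumptions, not Instagram’s actual implementation.

```python
# Hypothetical adjacency between topic clusters: engaging with one cluster
# pulls in recommendations from its neighbours, which is how a user viewing
# one kind of disinformation can be steered toward another.
RELATED_TOPICS = {
    "covid_disinfo": ["qanon", "antisemitic_disinfo"],
    "antisemitic_disinfo": ["antivax", "covid_disinfo"],
}

def extend_feed(followed_posts, inferred_interests, recommend):
    """Return followed content first, then unsolicited recommendations from
    topics adjacent to the user's inferred interests. `recommend(topic)` is
    a hypothetical lookup returning candidate posts for a topic cluster."""
    feed = list(followed_posts)
    for topic in inferred_interests:
        for neighbour in RELATED_TOPICS.get(topic, []):
            feed.extend(recommend(neighbour))
    return feed
```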

Asked whether any particular groups benefited most from the algorithmic promotion of extremist or conspiracist content, Ms. Benavidez of Free Press told the Committee that “we have anecdotal evidence at best,” but that “far-right and extremist content has been engaged with over six times more than other content that is politically neutral.” She said it was important to be “very careful about ... the way we point to evidence, to make sure that we are not making claims we cannot support. When lawmakers say that X or Y type of content is boosted, we have to make sure we have evidence to show.”[172] Dr. Peinert said he could “say only anecdotally that it seems that far-right content is more prevalent” and that “companies don’t appear to care what the political orientation of that content is as long as they can keep users engaged.”[173]

Hate and Harassment

Journalists and media workers are increasingly likely to find themselves the targets of online harassment and abuse. According to a 2021 Ipsos survey, 65% of media workers had experienced some form of online harassment in the previous year, with women, racialized persons and 2SLGBTQI+ persons at significantly greater risk.[174] Online harassment is largely “personal,” according to the survey, involving among other things “sexualized messages or images, physical threats, comments related to gender identity and ethnicity or nationality, and use of people’s names or images without their permission.”[175] One in ten respondents who experienced online harassment reported receiving death threats; nearly as many reported threats against family, rape threats and blackmail.[176]

Dr. Joan Donovan confirmed that “the harassment of women, women of colour and women who are journalists is almost of an epidemic proportion.” She explained that the use of hashtags to facilitate harmful behaviour “[creates] these online communities”[177] that perpetuate such harassment.

Ms. Julie Kotsis of Unifor noted that “a great deal” of the harassment and abuse on social media platforms “is aimed at journalists and media workers” and that a survey carried out by Unifor had shown that Twitter and Facebook were among the top sites for such content. She tied the topic of online harassment and abuse to the broader context for the present study, noting that “this is really about the ability of governments to enact meaningful rules and the willingness of tech giants to abide by those rules.”[178]

Harms to Children

Dr. Donovan told the Committee about an Instagram whistleblower, Arturo Béjar, who testified before the U.S. Congress that the company was aware of harmful content being made available to children but would not make changes that would “affect the bottom line.” She said of content encouraging self-harm that the platforms “do try to tamp it down” but that it remains a “major issue” that platforms have “moral and ethical responsibilities” to address through better design. She told the Committee that while “we have this perception that somehow moderation on platforms is censorship,” the function of moderation is to “[keep] spam out of your inbox and these bad actors from proliferating online.”[179]

Regulating Online Harms

Nora Benavidez said that the “largest tech companies have responded with disinterest” to “years of work by civil society, academics and lawmakers documenting social media harms and urging more accountability.” She said the platforms’ “failure to vet and remove content that violates their own stated terms of service harms and alienates users” and “inevitably also leads to the migration of lies and toxicity, from online platforms to mainstream media.” Ms. Benavidez also said that the companies have been “backsliding” on any efforts to improve safety:

In the last year alone, Meta, Twitter and YouTube have weakened their political ads policies, creating room for lies in ads ahead of next year’s elections around the world. They have weakened their privacy policies to give AI tools access to user data, and they’ve collectively laid off nearly 40,000 employees. Massive cuts have occurred across trust and safety teams, ethical engineering, responsible innovation and content moderation. Those are the teams tasked with maintaining a platform’s general health and protecting users from harm.[180]

Ms. Benavidez said that the companies “cannot be trusted to govern themselves” and have adopted “several new tactics ... to shut down inquiry and accountability,” including “cutting off researcher and API access to platform data” and taking legal action against researchers.[181]

Marc Hollin of Unifor echoed Mr. Ahmed’s observation that the platforms “want to appear to be passive entities, like a community bulletin board ... when in fact ... we know that the tech giants control, moderate and frankly profit from the transition of information and content in a myriad of ways.” He said that “falling back on the principle of platform accountability is the number one way” to make platforms deal responsibly with online content.[182]

Mr. Hollin said that to address the harassment of journalists and media workers online, the government should implement “tougher take-down requirements” for abusive content. He cited, as an example from other jurisdictions, the approach of “requiring the platforms, when there's a complaint, to act quickly—sometimes within 24 hours or faster—to take down online content that is hate-filled or harassing and abusive.”[183]
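
As a rough operational sketch of what such a statutory take-down clock could look like, a regulator auditing platform complaint logs might compute compliance as follows. The 24-hour window is the figure Mr. Hollin cited; the field and function names are hypothetical assumptions, not any jurisdiction’s actual scheme.

```python
from datetime import datetime, timedelta, timezone

TAKEDOWN_DEADLINE = timedelta(hours=24)  # the window cited in testimony

def is_overdue(complaint_received_at: datetime,
               content_removed_at: datetime | None,
               now: datetime) -> bool:
    # A complaint is overdue if the flagged content was not removed within
    # the statutory window and that window has already elapsed.
    deadline = complaint_received_at + TAKEDOWN_DEADLINE
    removed_in_time = (content_removed_at is not None
                       and content_removed_at <= deadline)
    return not removed_in_time and now > deadline

# Example: a complaint received at noon, still unresolved 30 hours later.
received = datetime(2024, 3, 1, 12, 0, tzinfo=timezone.utc)
print(is_overdue(received, None, received + timedelta(hours=30)))  # True
```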

Pierre Trudel cited the Digital Services Act as offering some “interesting avenues” for addressing online harms and said companies like Meta “should be required to analyze” and manage systemic risk “with a view to eliminating, or at least severely reducing, the various forms of harassment.”[184]

Mr. Hatfield said that “greater researcher access” to platforms would help legislators and policymakers determine “to what extent they reflect society and to what extent they are amplifying or driving” certain kinds of content.[185]

Mr. Hatfield echoed other witnesses’ recommendations to emphasize platform accountability: “having platforms obligated to explain how they manage content and, really, to report to their users and to a regulator what they’re doing and the risks they think they’re mitigating.” He said this would create “some competitive pressure between different platforms to learn how to manage some of this better.”[186]
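
To make the kind of disclosure Mr. Hatfield described concrete, the sketch below defines a minimal, machine-readable transparency record. Every field name here is a hypothetical assumption; no statute or platform format prescribes one.

```python
from dataclasses import dataclass

@dataclass
class TransparencyReport:
    """A minimal, hypothetical structure for periodic platform disclosures:
    what the platform did under its own rules, and which risks it believes
    it is mitigating."""
    platform: str
    period: str                   # e.g., "2024-Q1"
    items_flagged: int            # content flagged by users or systems
    items_actioned: int           # content removed or demoted under the rules
    median_response_hours: float  # how quickly complaints were handled
    risks_identified: list[str]   # e.g., ["self-harm content", "disinformation"]
    mitigations: list[str]        # measures the platform reports taking
```

Comparable reports published to users and filed with a regulator could then create the competitive pressure between platforms that Mr. Hatfield anticipated.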

Mr. Ahmed said that companies would “find a way to squeeze out of taking responsibility and retaliate against anything that you do try,” and recommended developing a “comprehensive framework” that “includes safety by design, transparency of the algorithms, economics and a content-enforcement policy.”[187] He described the United Kingdom’s Online Safety Act as “elegant” in “working within the platform’s own community standards” to achieve accountability:

The British solution ... says, “You set your own rules, but we want to see whether or not you enforce them in the right way.” You don’t need to make it more complicated than saying, “if you act in a negligent way with respect to enforcing the rules that you tell others they have to abide by, and if that creates harm for our society, then we will impose significant economic consequences on you.”[188]

Several witnesses expressed concern about the implications of regulating harmful content for freedom of expression.

Professor Geist noted that there would be “real challenges around misinformation and disinformation from a regulatory perspective.”[189]

Philip Palmer told the Committee that “it is very difficult to be able to say when you shouldn’t be able to access certain information or certain services” and that the question of “safety” would pose a “dilemma that lawmakers and individuals” would “face constantly and chronically in this space.”[190]

Mr. Palmer told the Committee that the “boundary between awful and lawful” content would be a difficult question for any harms legislation, saying “there’s always a danger when we suppress speech—and particularly when government, which has an interest in how people speak and, particularly, speak about it, is empowered to suppress elements of speech.” He described the boundary as “a tremendously dangerous line on which government has to be respectful of rights.”[191]

Mr. Hatfield echoed Mr. Palmer’s concerns about censorship, saying it was “critical that we don’t create a very censorious situation where the government indirectly forces platforms to remove a ton of lawful speech.” He said there was “potential for really critical social mobilization and conversations to be affected if we set out poorly designed regulation.”[192] He told the Committee it would be “quite dangerous” to “[have] the government in the position of deciding that people shouldn’t be expressing themselves”[193] in certain ways.

Recommendations on Regulating the Technology Sector

Over the course of their testimony, witnesses made several recommendations for the Committee’s consideration in respect of regulating companies like Google and Meta.

Jean-Hugues Roy said Canada should have “the means to acquire ... information” on the companies, such as “detailed financial statements,” and that Canada should “give [itself] more resources”:

In order to protect citizens, governments have given themselves the right to see how certain companies handle food, for example. They have given themselves the right to inspect aircraft and search travellers’ luggage. … The time has come for Canada to give itself the right to inspect what information these companies possess about Canadian citizens.[194]

Marc Hollin echoed Mr. Roy’s comments, saying, “[i]t’s really a question of national sovereignty and the right of legislators and citizens to enact rules of self-governance and to expect any entity that exists within that area to abide by them.”[195]

Several witnesses commented on how lawmakers had fallen behind as the tech giants rose to prominence. Pierre Trudel told the Committee that “we’ve lost decades by doing nothing about the tech giants, by taking a romantic view of the marvels of the Internet.”[196] Georg Riekeles said, “public action has consistently been too little and too late,”[197] in part because “the Internet came to us with the idea that Internet equals democracy” and because “a lot of the ideology that accompanies it has been very strong in convincing lawmakers and policy-makers that one could live by a self-regulatory model.”[198]

Dr. Donovan pointed out to the Committee that “technology is the policy. It’s not that we have an absence of regulation, but the technology arrives in the world, and if we fail to regulate it, it exists and makes its own policy … [I]t becomes very hard for regulators to come in a year, two years or 10 years after a product has been on the market and say, ‘Wait. Now we understand the harms and we want to do something about them.’”[199] Likewise, Mr. Trudel said that “the practices of multinational firms, and their various technical configurations, establish regulations by default.”[200]

Bram Vranken recommended “[protecting] the decision-making process from privileged access by big tech, for example, by limiting the access these companies have to decision-makers. At the same time, policymakers should reach out to those who do not have the resources to make themselves heard, such as SMEs, civil society, independent researchers and local groups.”[201] Mr. Riekeles emphasized the need for reporting on influence “project by project, euro by euro. Interference strategies need to be systematically monitored and counted.”[202]

Philip Palmer told the Committee that “Canada is too small in population and in wealth to establish the norms by which the Internet will be regulated or how Internet service providers will govern themselves. If Canada overreaches and imposes unrealistic economic and social costs on Internet services, it may find its businesses and its citizens cut off from the services and knowledge that are available to its peers.”[203]

Several witnesses recommended studying examples of regulation from other jurisdictions. Mr. Palmer said, “There are a number of experiments under way in democratic societies that deal with Internet and tech regulation that Canada can learn from, emulate or cooperate with.”[204] Mr. Trudel also recommended “[paying] attention to methods being used and regulations being implemented by other democratic countries,” not least because “we are dealing with multinationals that operate around the world.”[205] He recommended an “urgent intensification of collaborative work with other countries.”[206]

Recommendations

The Committee recommends:

Recommendation 1

That digital content platforms be required to put mechanisms in place to detect undesirable or questionable content that may be the product of disinformation or foreign interference, to promptly identify such content and to report it to users, and that failure to do so result in penalties.

Recommendation 2

That the Government of Canada develop an extensive information and awareness campaign on the dangers of disinformation, as well as on how to detect and protect against it.

Recommendation 3

That the Government of Canada require digital platforms to collaborate with independent academic researchers by providing, upon request and by any means deemed appropriate, the data needed to understand our digital ecosystem, particularly with regard to the way that exposure to harmful content affects vulnerable people, such as children.

Recommendation 4

That online communication service providers be required to take measures to ensure that the procedures, practices, rules and systems, including algorithms, put in place to moderate content that is communicated on their services and that is accessible to individuals in Canada do not result in adverse differential treatment of any individual or group of individuals based on one or more prohibited grounds of discrimination.

Recommendation 5

That the Government of Canada amend the Income Tax Act, specifically the rules that allow advertising purchased by businesses on foreign websites to be claimed as a fully deductible expense while restrictions remain on deducting the cost of advertising with Canadian media.


[1]                House of Commons, Standing Committee on Canadian Heritage (CHPC), Minutes of Proceedings, 20 March 2023.

[2]                Public Policy Forum, The Shattered Mirror: News, Democracy and Trust in the Digital Age, 2017.

[3]                House of Commons Standing Committee on Canadian Heritage, Disruption: Change and Churning in Canada’s Media Landscape, June 2017, p. 73.

[4]                Online News Act (S.C. 2023, c. 23).

[6]                Standing Senate Committee on Transport and Communications (TRCM), Sixth Report, 14 June 2023.

[7]                See Laurence Brosseau, Gabrielle de Billy Brown, and Marion Ménard, Legislative Summary of Bill C-18, An Act respecting online communications platforms that make news content available to persons in Canada, Library of Parliament, 44-1-C18-E, 13 October 2022.

[8]                Ibid.

[10]              Government of Canada, Canada Gazette, Part 2, Volume 158, Number 1: Online News Act Application and Exemption Regulations, 15 December 2023.

[11]              Ibid.

[12]              Canadian Radio-Television and Telecommunications Commission (CRTC), CRTC launches public consultation to implement the Online News Act, 13 March 2024.

[14]              CHPC, Evidence, 18 October 2022, 1125.

[15]              CHPC, Evidence, 18 October 2022, 1125 (Colin McKay, Head, Public Policy and Government Relations, Google Canada).

[16]              Ibid.

[17]              Ibid.

[18]              Ibid.

[19]              Ibid.

[20]              Reuters, “Google tests blocking news content for some Canadians,” 22 February 2023.

[21]              CHPC, Minutes of Proceedings, 28 February 2023.

[22]              CHPC, Evidence, 10 March 2023, 1305 (Jason Kee, Public Policy Manager, Google Canada).

[23]              Kent Walker, “An update on Canada’s Bill C-18 and our Search and News products,” Google Canada Blog, 29 June 2023.

[24]              Department of Canadian Heritage, Statement by Minister St-Onge on next steps for the Online News Act, 29 November 2023.

[25]              Peter Zimonjic and Louis Blouin, “Almost two-thirds of Google’s $100 million media fund will go to print, digital media,” CBC News, 15 December 2023.

[26]              Meta, “Sharing Our Concerns With Canada’s Online News Act,” 21 October 2022.

[27]              Ibid.

[28]              Ibid.

[29]              CHPC, Evidence, 28 October 2022, 1300.

[30]              “Meta to block access to news on Facebook, Instagram if Online News Act adopted as-is,” The Canadian Press, 12 March 2023.

[31]              CHPC, Evidence, 8 May 2023, 1130 (Kevin Chan, Global Policy Director, Meta Platforms Inc.).

[32]              Meta, “Changes to News Availability on Our Platforms in Canada”, 1 June 2023.

[33]              Ibid.

[34]              Ibid.

[35]              Ibid.

[36]              Marie Woolf, “Ottawa makes overture to Meta to restart talks in hope of ending news block,” The Globe and Mail, 1 December 2023 [subscription required].

[37]              CHPC, Minutes of Proceedings, 28 February 2023.

[38]              CHPC, Minutes of Proceedings, 20 March 2023.

[39]              CHPC, Evidence, 20 April 2023, 1600 (Kent Walker, President, Global Affairs, Google LLC).

[40]              Ibid.

[41]              CHPC, Evidence, 20 April 2023, 1555 (Kent Walker).

[42]              CHPC, Evidence, 20 April 2023, 1535 (Richard Gingras, Vice-President, News, Google LLC).

[43]              CHPC, Evidence, 20 April 2023, 1540 (Richard Gingras).

[44]              CHPC, Evidence, 20 April 2023, 1535 (Richard Gingras).

[45]              CHPC, Evidence, 20 April 2023, 1610 (Richard Gingras).

[46]              CHPC, Evidence, 20 April 2023, 1550 (Kent Walker).

[47]              CHPC, Evidence, 20 April 2023, 1545 (Richard Gingras).

[48]              CHPC, Evidence, 20 April 2023, 1530 (Kent Walker).

[49]              CHPC, Evidence, 20 April 2023, 1700 (Kent Walker).

[50]              Edward T. Walker et al., “Poisoning the Well: How Astroturfing Harms Trust in Advocacy Organizations,” Social Currents 10(2), 22 October 2022.

[51]              CHPC, Evidence, 20 April 2023, 1645 (Kent Walker).

[52]              CHPC, Evidence, 20 April 2023, 1705 (Kent Walker).

[53]              CHPC, Evidence, 20 April 2023, 1700 (Kent Walker).

[54]              CHPC, Evidence, 8 May 2023, 1125 (Kevin Chan).

[55]              Ibid.

[56]              Meta blocked news in Australia during deliberations on that country’s News Media Bargaining Code. Non-news sites such as government sites, emergency services, and charities were also affected by the blackout. Whistleblowers from inside the company have said that the blocking of non-news sites was deliberate. See Josh Taylor, “Deliberate ploy: whistleblowers reveal why Facebook’s Australia news ban included non-news sites,” The Guardian, 28 May 2022.

[57]              CHPC, Evidence, 8 May 2023, 1150 (Rachel Curran, Head of Public Policy, Canada, Meta Platforms Inc.).

[58]              CHPC, Evidence, 8 May 2023, 1225 (Rachel Curran). It is worth noting that on 1 March 2024, Meta announced that it would no longer pay Australian news publishers for news content shared on Facebook. See Byron Kaye and Lewis Jackson, “Facebook owner Meta angers Australia with plan to stop paying for news content,” Reuters, 1 March 2024.

[59]              CHPC, Evidence, 8 May 2023, 1225 (Rachel Curran).

[60]              CHPC, Evidence, 8 May 2023, 1235 (Rachel Curran).

[61]              CHPC, Evidence, 8 May 2023, 1155 (Kevin Chan).

[62]              CHPC, Evidence, 8 May 2023, 1230 (Kevin Chan).

[63]              CHPC, Evidence, 8 May 2023, 1235 (Kevin Chan).

[64]              CHPC, Evidence, 8 May 2023, 1210 (Kevin Chan).

[65]              CHPC, Evidence, 8 May 2023, 1215 (Kevin Chan).

[66]              CHPC, Evidence, 8 May 2023, 1230 (Rachel Curran).

[67]              CHPC, Evidence, 8 May 2023, 1150 (Rachel Curran).

[68]              Ibid.

[69]              CHPC, Evidence, 8 May 2023, 1205 (Kevin Chan).

[70]              CHPC, Evidence, 8 May 2023, 1220 (Rachel Curran).

[71]              CHPC, Evidence, 14 December 2023, 0830 (Georg Riekeles, Associate Director, European Policy Centre, As an Individual).

[72]              CHPC, Evidence, 14 December 2023, 0825 (Bram Vranken, Researcher, Corporate Europe Observatory).

[73]              CHPC, Evidence, 28 November 2023, 1100 (Imran Ahmed, Chief Executive Officer, Center for Countering Digital Hate).

[74]              CHPC, Evidence, 5 December 2023, 1120 (Dr. Courtney Radsch, Director, Center for Journalism and Liberty, Open Markets Institute).

[75]              CHPC, Evidence, 5 December 2023, 5655 (Dr. Erik Peinert, Research Manager, American Economic Liberties Project).

[76]              Reuters, “Google U.S. lobbying jumps 27% as lawmakers aim to rein in Big Tech,” 20 January 2022.

[77]              Adam Satariano et al., “Big Tech Turns Its Lobbyists Loose on Europe, Alarming Regulators,” New York Times, 14 December 2020.

[78]              See European Parliament, Digital Services Act and aggressive lobbying by Google, Parliamentary question – E‑000162/2021, 13 January 2021.

[79]              CHPC, Evidence, 14 December 2023, 0830 (Georg Riekeles).

[80]              CHPC, Evidence, 28 November 2023, 1120 (Jason Kint, Chief Executive Officer, Digital Content Next).

[81]              CHPC, Evidence, 14 December 2023, 0825 (Bram Vranken).

[82]              Ibid.

[83]              Ibid.

[84]              CHPC, Evidence, 14 December 2023, 0830 (Georg Riekeles).

[85]              CHPC, Evidence, 14 December 2023, 0945 (Georg Riekeles).

[86]              CHPC, Evidence, 5 December 2023, 1120 (Dr. Courtney Radsch).

[87]              CHPC, Evidence, 28 November 2023, 1125 (Dr. Michael Geist, Canada Research Chair in Internet and E-Commerce Law, Faculty of Law, University of Ottawa, As an Individual).

[88]              CHPC, Evidence, 14 December 2023, 0815 (Dr. Joan Donovan, Online Disinformation and Misinformation Expert, Boston University College of Communication, As an Individual). Frances Haugen is a former Facebook employee who in 2021 shared confidential documentation with journalists, lawmakers and regulators on the company’s failure to implement safeguards for its users.

[89]              Ibid.

[90]              CHPC, Evidence, 28 November 2023, 1230 (Jason Kint). For details, see Shannon Bond, “NYU Researchers Were Studying Disinformation on Facebook. The Company Cut Them Off,” NPR, 4 August 2021.

[91]              CHPC, Evidence, 5 December 2023, 1250 (Dr. Courtney Radsch).

[92]              CHPC, Evidence, 14 December 2023, 0835 (Matthew Hatfield, Executive Director, OpenMedia).

[93]              CHPC, Evidence, 5 December 2023, 5700 (Nora Benavidez, Senior Counsel and Director of Digital Justice and Civil Rights, Free Press).

[94]              “About Us,” Tech Transparency Project.

[95]              “Zuckerberg and Meta Reach Deep into Academia,” Tech Transparency Project, 6 December 2023.

[96]              Mohammad Abdalla et al., “The Grey Hoodie Project: Big Tobacco, Big Tech, and the Threat on Academic Integrity,” Proceedings of the 2021 AAAI/ACM Conference on AI, Ethics, and Society, July 2021, p. 6.

[97]              CHPC, Evidence, 14 December 2023, 0900 (Dr. Joan Donovan).

[98]              Ibid.

[99]              Ibid.

[100]           CHPC, Evidence, 28 November 2023, 1125 (Dr. Michael Geist).

[101]           CHPC, Evidence, 5 December 2023, 1120 (Dr. Courtney Radsch).

[102]           CHPC, Evidence, 14 December 2023, 0830 (Georg Riekeles).

[103]           Ibid.

[104]           CHPC, Evidence, 28 November 2023, 1120 (Jason Kint).

[105]           Ibid.

[106]           Ibid.

[107]           CHPC, Evidence, 14 December 2023, 0845 (Philip Palmer, President, Internet Society Canada Chapter).

[108]           CHPC, Evidence, 5 December 2023, 5715 (Sean Speer, Editor-at-large, The Hub).

[109]           CHPC, Evidence, 28 November 2023, 1135 (Dr. Michael Geist).

[110]           CHPC, Evidence, 14 December 2023, 0835 (Matthew Hatfield).

[111]           CHPC, Evidence, 28 November 2023, 1155 (Dr. Michael Geist).

[112]           CHPC, Evidence, 14 December 2023, 0840 (Jeff Elgie, Chief Executive Officer, Village Media Inc.).

[113]           CHPC, Evidence, 5 December 2023, 5655 (Dr. Erik Peinert). For reference, see “Google should pay millions for Swiss news, says study,” Swissinfo.ch, 17 March 2023.

[114]           CHPC, Evidence, 5 December 2023, 1150 (Dr. Courtney Radsch). For reference, see Matthew Ingram, “How much do Google and Meta owe publishers? Twelve billion dollars, a new study says,” Columbia Journalism Review, 16 November 2023.

[115]           CHPC, Evidence, 14 December 2023, 0940 (Jeff Elgie and Philip Palmer).

[116]           CHPC, Evidence, 14 December 2023, 0845 (Philip Palmer).

[117]           CHPC, Evidence, 14 December 2023, 0840 (Jeff Elgie).

[118]           CHPC, Evidence, 5 December 2023, 5710 (Sean Speer).

[119]           CHPC, Evidence, 28 November 2023, 1200 (Dr. Michael Geist).

[120]           CHPC, Evidence, 28 November 2023, 1120 (Jason Kint).

[121]           CHPC, Evidence, 28 November 2023, 1230 (Jason Kint).

[122]           CHPC, Evidence, 14 December 2023, 0840 (Jeff Elgie).

[123]           CHPC, Evidence, 5 December 2023, 1105 (Peter Menzies, As an Individual).

[124]           CHPC, Evidence, 14 December 2023, 0855 (Matthew Hatfield).

[125]           CHPC, Evidence, 5 December 2023, 5655 (Dr. Erik Peinert).

[126]           CHPC, Evidence, 5 December 2023, 1140 (Dr. Courtney Radsch).

[127]           Ibid.

[128]           CHPC, Evidence, 5 December 2023, 1105 (Peter Menzies).

[129]           CHPC, Evidence, 14 December 2023, 0850 (Philip Palmer).

[130]           CHPC, Evidence, 14 December 2023, 0850 (Jeff Elgie).

[131]           CHPC, Evidence, 28 November 2023, 1225 (Dr. Michael Geist).

[132]           Ibid.

[133]           Standing Senate Committee on Transport and Communications, The Tax Deductibility of Foreign Internet Advertising in Canada, August 2018, p. 12.

[135]           CHPC, Evidence, 5 December 2023, 5820 (Erik Peinert).

[136]           CHPC, Evidence, 5 December 2023, 1225 (Pierre Trudel, Professor, Public Law Research Center, Université de Montréal, Law School, As an Individual).

[137]           CHPC, Evidence, 28 November 2023, 1210 (Jean-Hugues Roy, Professor, École des médias, Université du Québec à Montréal, As an Individual).

[138]           CHPC, Evidence, 5 December 2023, 1200 (Marc Hollin, National Representative, Unifor).

[139]           CHPC, Evidence, 5 December 2023, 1250 (Courtney Radsch).

[140]           CHPC, Evidence, 28 November 2023, 1210 (Dr. Michael Geist).

[141]           CHPC, Evidence, 5 December 2023, 1245 (Pierre Trudel).

[142]           CHPC, Evidence, 14 December 2023, 0850 (Philip Palmer).

[143]           CHPC, Evidence, 5 December 2023, 1205 (Peter Menzies).

[144]           CHPC, Evidence, 5 December 2023, 1105 (Peter Menzies).

[145]           Ibid.

[146]           Ibid.

[147]           CHPC, Evidence, 14 December 2023, 0855 (Jeff Elgie).

[148]           United Kingdom, Online Safety Act 2023.

[149]           European Commission, The Digital Services Act.

[152]           Ibid.

[153]           Ibid.

[154]           Ibid.

[155]           Ibid.

[156]           Congressional Research Service, Social Media Algorithms: Content Recommendation, Moderation, and Congressional Considerations, 27 July 2023.

[157]           CHPC, Evidence, 14 December 2023, 0835 (Matthew Hatfield).

[158]           CHPC, Evidence, 14 December 2023, 0910 (Dr. Joan Donovan).

[159]           CHPC, Evidence, 14 December 2023, 0935 (Dr. Joan Donovan).

[160]           CHPC, Evidence, 5 December 2023, 1125 (Julie Kotsis, Media Representative, National Executive Board, Unifor).

[161]           CHPC, Evidence, 28 November 2023, 1110 (Imran Ahmed).

[162]           Ibid.

[163]           CHPC, Evidence, 28 November 2023, 1115 (Jean-Hugues Roy).

[164]           CHPC, Evidence, 28 November 2023, 1250 (Imran Ahmed).

[165]           Ibid.

[166]           United Nations Educational, Scientific and Cultural Organization (UNESCO), Journalism, ‘Fake News’ and Disinformation: A Handbook for Journalism Education and Training.

[167]           Government of Canada, Learn about online disinformation, 11 January 2024.

[168]           Government of Canada, Online disinformation, 29 January 2024.

[169]           Statistics Canada, Online safety in Canada, 2022, 20 July 2023.

[170]           Statistics Canada, The Daily — Concerns with misinformation online, 2023, 20 December 2023.

[171]           CHPC, Evidence, 28 November 2023, 1110 (Imran Ahmed).

[172]           CHPC, Evidence, 5 December 2023, 5745 (Nora Benavidez).

[173]           CHPC, Evidence, 5 December 2023, 5750 (Dr. Erik Peinert).

[174]           Ipsos, Online Harm in Journalism, 8 November 2021, p. 4.

[175]           Ibid., p. 5.

[176]           Ibid., p. 18.

[177]           CHPC, Evidence, 14 December 2023, 0940 (Dr. Joan Donovan).

[178]           CHPC, Evidence, 5 December 2023, 1125 (Julie Kotsis).

[179]           CHPC, Evidence, 14 December 2023, 0910 (Dr. Joan Donovan).

[180]           CHPC, Evidence, 5 December 2023, 5700 (Nora Benavidez).

[181]           Ibid.

[182]           CHPC, Evidence, 5 December 2023, 1200 (Marc Hollin).

[183]           Ibid.

[184]           CHPC, Evidence, 5 December 2023, 1225 (Pierre Trudel).

[185]           CHPC, Evidence, 14 December 2023, 1010 (Matthew Hatfield).

[186]           CHPC, Evidence, 14 December 2023, 1015 (Matthew Hatfield).

[187]           CHPC, Evidence, 28 November 2023, 1135 (Imran Ahmed).

[188]           CHPC, Evidence, 28 November 2023, 1220 (Imran Ahmed).

[189]           CHPC, Evidence, 28 November 2023, 1215 (Dr. Michael Geist).

[190]           CHPC, Evidence, 14 December 2023, 0955 (Philip Palmer).

[191]           CHPC, Evidence, 14 December 2023, 1010 (Philip Palmer).

[192]           CHPC, Evidence, 14 December 2023, 1010 (Matthew Hatfield).

[193]           CHPC, Evidence, 14 December 2023, 1015 (Matthew Hatfield).

[194]           CHPC, Evidence, 28 November 2023, 1110 (Jean-Hugues Roy).

[195]           CHPC, Evidence, 5 December 2023, 1200 (Marc Hollin).

[196]           CHPC, Evidence, 5 December 2023, 1150 (Pierre Trudel).

[197]           CHPC, Evidence, 14 December 2023, 0930 (Georg Riekeles).

[198]           CHPC, Evidence, 14 December 2023, 1000 (Georg Riekeles).

[199]           CHPC, Evidence, 14 December 2023, 0905 (Dr. Joan Donovan).

[200]           CHPC, Evidence, 5 December 2023, 1110 (Pierre Trudel).

[201]           CHPC, Evidence, 14 December 2023, 0825 (Bram Vranken).

[202]           CHPC, Evidence, 14 December 2023, 0945 (Georg Riekeles).

[203]           CHPC, Evidence, 14 December 2023, 0845 (Philip Palmer).

[204]           CHPC, Evidence, 14 December 2023, 0850 (Philip Palmer).

[205]           CHPC, Evidence, 5 December 2023, 1245 (Pierre Trudel).

[206]           CHPC, Evidence, 5 December 2023, 1150 (Pierre Trudel).