:
I call this meeting to order.
Good morning, everyone, and welcome to meeting number 110 of the House of Commons Standing Committee on Industry and Technology.
Today’s meeting is taking place in a hybrid format, pursuant to the Standing Orders.
Pursuant to the order of reference of Monday, April 24, 2023, the committee is resuming consideration of Bill C-27.
I'd like to welcome all the witnesses here today.
From the Alliance of Canadian Cinema, Television and Radio Artists, we have Eleanor Noble, national president, who is joined by Marie Kelly, national executive director. We also have Stéphanie Hénault, director of legal affairs at the Association nationale des éditeurs de livres, as well as Marie-Julie Desrochers, executive director of the Coalition for the Diversity of Cultural Expression. From the Directors Guild of Canada, we have Dave Forget, national executive director, and Samuel Bischoff, manager of policy and regulatory affairs. Lastly, from Music Canada, we have Patrick Rogers, chief executive officer.
[English]
Thanks for being here with us this Monday morning as we're nearing the end of this study on Bill C-27. You are generating, as I see already, a lot of excitement in the room, so thanks for making the time to enlighten us with your testimony and your answers to our questions today.
Without further ado, we can start with Ms. Noble for five minutes.
The floor is yours.
I'm Eleanor Noble, the national president of the Alliance of Canadian Cinema, Television and Radio Artists.
Thank you for the opportunity to speak to this committee on behalf of the 30,000 members of our union. With me today is Marie Kelly, our national executive director. She's here with me to address any questions you may have.
For 80-plus years, ACTRA has represented professional performers across Canada who bring Canadian stories to life. We play a vital role in a nearly $14-billion industry that generates 240,000 jobs a year. We came to this committee today because we are concerned about the use of artificial intelligence and similar technologies in our industry.
To be clear, there are some positives to the adoption of technology in our industry when used responsibly. That said, our members are increasingly concerned about the unbridled and unmitigated use of AI in our industry and outside of it, which has the potential to significantly and harmfully impact our ability to work and make a living in the screen industry.
Last year, we undertook a comprehensive survey of our members about the impact of AI. Outside of collective bargaining, we have never had more responses to a member survey. Let me share with you some of the high-level takeaways: 98% of ACTRA performers are concerned about the potential misuse of their name, image and likeness by AI; 93% of respondents are concerned that AI will eventually replace human actors, beginning with background work and dubbing.
We have seen real examples of harmful use already. It was brought to ACTRA's attention last year that the voice of one of ACTRA's minor performers—underage performers—was uploaded to an AI text-to-speech voices list on a public website that allowed users to manipulate her voice to say crude, R-rated things. This is a minor, I'll remind you. This is unacceptable.
Similarly, an ACTRA performer on a video game was downloaded by players and, with the use of AI, their voice and game character were manipulated to say obscene things and to perform sexual acts, all without the knowledge or consent of the actor. This was accessible online for two years before the actor became aware, at which point ACTRA was contacted to step in.
These are just two examples of the harmful manipulations that performers—and, frankly, many Canadians—face. I think we can agree as Canadians that these are extremely harmful violations. We—you—have an opportunity to take action in this bill to protect us, and we are asking you to do so.
We are pleased that the government is reviewing the impact of AI in a multi-faceted manner. We believe it's important to update the privacy regime in Canada and to put a framework in place to ensure AI developers and deployers must take action to mitigate the potential for harm from their technologies.
We want to congratulate the government on bringing Bill C-27 forward and, in particular, we support your intention to ensure that consent is required for the use of biometric information, including a performer's name, image and likeness. Clarity around informed consent, we hope, will help in our work to ensure the industry does not ambush performers into signing away their rights.
This committee must push this bill further to clarify the type of harm performers experience on an all-too-regular basis. Not only is AI causing personal harm to performers like me, but it also risks our livelihoods and reputations. In the entertainment business, our reputation—including our name, image and likeness—is all we have. We are the brand, which we protect. The difference between getting a job one day and not getting one the next can come down to the most minute things, including one's reputation.
Sadly, reputational harm is not currently encompassed by Bill C-27. Defining harm to include “psychological harm” or “economic loss to an individual” does not sufficiently capture the reputational harm we experience. Due to the nature of our business, we might not be able to point to an exact instance of work lost because of a deepfake or manipulation, but there is no doubt that damage to a performer's reputation means real and tangible loss for our careers.
We have submitted to this committee our proposed language to rectify this gap under the legislation. We strongly urge this committee to amend the definition of harm to ensure that performers' rights are protected under this bill.
Finally, the government must take action to amend other statutes to mitigate the harm of AI on Canadian performers. Specifically, we believe that the Copyright Act is fundamentally biased against performers by not ascribing a moral right to their work. We urge this committee to take action, either through this bill or with haste elsewhere, to protect Canadian performers. We understand that the upcoming budget bill may contain amendments to the Copyright Act, and we ask that you raise with the minister the urgency of providing moral rights to performers in it, as musicians have.
Committee members, we recognize that this bill is only scratching the surface of the public policy tools the government has on this file. We urge you to take us seriously. Our sector is an economic driver in this country, with real workers who strive to make a living and contribute to our Canadian cultural life. We need you, as legislators, to ensure that we can be protected and can continue to work today and into the future.
Thank you. Marie and I would be happy to take any of your questions.
:
On behalf of the Association nationale des éditeurs de livres, I want to thank you for having me here in connection with this study on the first legislative initiative to specifically regulate artificial intelligence systems in Canada.
My name is Stéphanie Hénault, and I am the director of legal affairs at the association, which represents francophone book publishing companies across the country. Together with the Union des écrivaines et des écrivains québécois, the association that represents authors in Quebec, we have established Copibec–Gestion collective des droits de reproduction, which offers copyright and royalty management solutions for users and rights holders. Affiliated with the International Publishers Association, the largest publishing federation in the world, we promote publishing as a driver of economic, cultural and social development and are leaders in its evolution. By participating collectively in major international fairs and salons, hosting foreign publishers, booksellers and journalists here in Canada and taking part in numerous foundational projects, we work on many fronts to raise the profile of French Canadian books.
For example, we established the Entrepôt ANEL-De Marque, which has fostered the successful development of a business model that complements that of the print industry, and we support our members in implementing digital strategies that promote their development. More specifically, we promote books in all formats in francophone countries and the translation of those books in countries such as Germany, Argentina, China, Egypt, Spain, the United States, Mexico, Iceland, Sweden, Serbia and Turkey, to name only a few.
The more Canadian literature is read internationally, the more popular it becomes among readers. The more often it's noticed by juries, the more awards it wins and the more it sells on all continents, including in our own country. The following numbers show how successful French Canadian books have become. In Quebec alone, sales of new books represent a market valued at approximately $680 million a year. Also in Quebec, the market share of francophone publishing companies represents 50% of sales, even though 900 foreign publishers distribute their books here.
In the artificial intelligence era, the entire Canadian book publishing industry needs support, now more than ever, in the form of updated policies and programs that encourage the lawful supply of content in this field.
This is why we took an active part in the recent consultation on generative AI and copyright by supporting the responsible development of artificial intelligence. We did it with Access Copyright, the Association of Canadian Publishers, the Association des éditeurs de langue anglaise du Québec, the Canadian Authors Association, the Canadian Publishers' Council, Copibec, the Literary Press Group of Canada, the Regroupement des éditeurs franco-canadiens, the Writers' Union of Canada, the Union des écrivaines et des écrivains québécois, as well as with our partners in the Coalition for the Diversity of Cultural Expressions.
The global publishing industry relies on copyright, particularly the exclusive right to authorize or prohibit the use of works and to engage in fee-based licensing. These rights are engaged when works are integrated into AI systems and again when those systems reproduce works in their operation. For rights holders, the ability to grant or withhold permission to use works in these ways is as important as the compensation that may follow, particularly when the output of an artificial intelligence system competes with the work, substitutes for it or undermines the author's moral rights, to name only a few of the forms of harm.
In the British, European and North American markets, we are seeing increasing numbers of copyright infringement actions against AI models, as well as agreements being reached to license content for text and data mining purposes.
In Canada, licensing for text and data mining is a growing market. On behalf of the book publishing industry, we implore the government to encourage that market by amending part 3 of Bill C-27 so that it clearly establishes that artificial intelligence must be developed and deployed responsibly, in the following manner: first, by implementing procedures that guarantee compliance with copyright legislation when models are trained; second, by establishing transparency obligations regarding the publication and availability of information on the content integrated into AI systems; and, lastly, by clearly and expressly stating, in licensing conditions with users, that users are required to comply with copyright.
The Copyright Act affords copyright holders remedies for addressing infringement involving AI developers, suppliers and users. First, however, AI framework legislation must at least provide that the intellectual property of Canadians be respected. Otherwise, the Canadian royalties market could well be hit even harder as systems are developed and deployed secretly, unfairly and unlawfully.
Let me be very clear: we are not opposed to artificial intelligence, but we do contend that all Canadian market actors must support the legitimate interests of authors and publishers, as well as their essential contribution to innovation, knowledge, culture, diversity, cultural outreach, the economy and wealth of the country. We therefore emphasize that you must ensure our country at least complies with international practices respectful of authors and publishers, as Europe is doing with its new AI legislation, to prevent Canada from looking like a banana republic of international technology companies.
In conclusion, I want to emphasize that authors and publishers are also counting on you to improve the Copyright Act, failing which they will be unable to receive the legitimate royalties that their international counterparts receive when their works are reproduced at certain educational institutions. I would also remind you that this priority was supported by the Standing Committee on Science and Research in its November 2023 report entitled Support for the Commercialization of Intellectual Property.
On behalf of the Association nationale des éditeurs de livres, thank you very much for listening. I will be pleased to answer your questions.
:
Mr. Chair and members of the committee, thank you for your invitation and for this opportunity for the cultural sector to comment on Bill C-27.
I am the executive director of the Coalition for the Diversity of Cultural Expressions, which this year celebrates its 25th anniversary. The coalition consists of more than 50 members from Canada's cultural sector: anglophone and francophone unions, professional associations and collection societies. We cover a broad and diverse range of audiovisual, musical, digital arts, book and publishing disciplines, as well as the visual and performing arts. We also represent more than 350,000 creators and nearly 3,000 businesses in the cultural industry.
I'm in good company today, surrounded by three coalition members: the Association nationale des éditeurs de livres du Québec, the Directors Guild of Canada and the Alliance of Canadian Cinema, Television and Radio Artists. This small sample represents only part of the impact that the development of artificial intelligence has had on our sector. I encourage you to continue consulting the cultural sector so you can also hear from the representatives of visual artists, screenwriters, producers, composers, authors and others.
Our coalition's primary mission is to secure a cultural exclusion in trade agreements in order to preserve Canada's cultural sovereignty. We also want to ensure that Canada adopts public policies that guarantee protection and promotion for the diversity of cultural expressions, including in the digital environment. Our efforts build on the 2005 Convention on the Protection and Promotion of the Diversity of Cultural Expressions. That UNESCO convention came to be as a result of the concerted efforts of Quebec and Canada, and France as well, and I would note that Canada was the first country to ratify it.
We are here today to comment on a bill that is designed to protect Canadians from the risks presented by the spectacular developments in artificial intelligence, generative AI in particular.
The 2005 convention states that cultural diversity is "indispensable for peace and security at the local, national and international levels". In other words, the development of responsible artificial intelligence must take that diversity into account and ensure it is protected. Diversity is essential in safeguarding our freedom of expression, the health of our democracy and the maintenance of our sovereignty.
Bill C-27 essentially addresses the risks facing individuals as a result of artificial intelligence. As others before us have done, we wish to emphasize how important it also is to consider the societal risks that artificial intelligence presents.
The purpose of the new legislation, stated in clause 4, and the definition of harm that appears in the text are too limited. Adopting wording found in the European Union's AI legislation, we suggest that one of the purposes of the new act be to protect the health, safety and fundamental Charter rights, including democracy—of which the diversity of cultural expressions is a pillar—and the rule of law, as well as the environment, from the harmful effects of AI systems.
The main theme for today's witnesses is copyright. That's heartening because we are convinced that Bill C-27 has a major role to play in this area.
The Canadian government recently conducted a consultation on the impact of generative AI on copyright. The cultural community's unanimous view is that, contrary to the widely held perception, Canadian copyright legislation doesn't need to be significantly modernized to protect rights holders in reaction to developments in generative AI. It already protects human creation and prohibits the unauthorized use of protected cultural content. However, as a result of a lack of transparency regarding the data used to train AI systems, that act cannot be applied in an optimal fashion. This is where Bill C-27 must play a role.
Here are two specific potential solutions that would restore the Copyright Act to full effectiveness for the benefit of rights holders and Canadians as well.
We should draw on European AI legislation and go beyond the obligation to retain data records, as provided in the new subsection 7(2) proposed by amendment to Bill C-27, and, in particular, provide that a sufficiently detailed summary of the use of copyright-protected training data be made available to the public.
It should also be more clearly stated that Bill C-27 creates responsibilities with respect to the Copyright Act, as the European Union has done.
The accountability framework outlined in new subsection 12(5) moved by amendment to Bill C-27 could thus support policies and procedures concerning the Copyright Act and the use of an individual's voice, image or reputation.
These additions would be consistent with the regulations being introduced at the international level and would foster the development of a licensing market based on consent and remuneration of rights holders.
Thank you for your attention. I will be pleased to answer your questions.
[English]
Dear Chair and members of the committee, good morning. Thank you for the opportunity to participate in this important work.
My name is Dave Forget, DGC's national executive director. With me today is Sam Bischoff, manager of policy and regulatory affairs. We appreciate the committee's invitation.
Generative AI threatens the ecosystem of creativity on an existential level. Creators should have the right to consent and be compensated whenever an AI entity uses their copyrighted content. As we stand at the crossroads of regulating AI to protect against harms, we believe it is crucial to protect creators from the economic and moral harms under AIDA.
The Directors Guild of Canada is a national labour organization that represents over 7,000 key creative and logistical personnel in the film, television and digital media industries. This includes over 1,000 director members working across the country on screen-based programming.
The Canadian film and television sector generates massive amounts of value, employment and soft power. In 2021-22, the entire screen sector value chain directly contributed an estimated 337,000 jobs, $16.6 billion in labour income and $23.3 billion in GDP to the Canadian economy. However, artificial intelligence threatens the core of this ecosystem. Large language model developers are reproducing extensive amounts of creative works for commercial purposes without authorization from, or fair compensation for, their authors.
Copyright remains a central framework law for governing our industry. Any unauthorized copying to train AI models is theft. Moreover, it is very difficult for rights holders to know when their works have been used without their consent in training AI models. Creators should be able to control whether their works are copied and used for mining purposes in the first place. Transparency in AI systems must be a prerequisite to defending authors' rights. This is a fundamental element to secure a future where human creativity can flourish.
In its current form, AIDA is failing to protect and uphold fundamental copyright principles. We need AIDA to do the following, consistent with the protections being provided to creators under the EU Artificial Intelligence Act, also known as EU AIA.
One, confirm that the use of copyright-protected content requires the authorization of the rights holder. This would be subject to the limited exception in copyright for technical ephemeral copying, which represents Canada’s very limited exception for text and data mining activities, to the extent that it applies.
Two, general-purpose AI systems like ChatGPT must be transparent about the materials used for training purposes. They should provide a description of information on the data used for training, testing and validation, as well as how this data was obtained and selected.
Three, providers of general-purpose AI models must be required to put in place a policy to respect Canadian law on copyright, including obtaining consent for text and data mining purposes. Any provider who makes available a general-purpose AI model in the Canadian market should comply with this obligation, regardless of the jurisdiction in which the AI training takes place.
I'll turn it over to Sam.
:
Our European counterparts were able to secure these crucial rights to protect their cultural industries. There should be no reason the government can't provide the same level of protections in Canada. Large and well-funded platforms like Google, OpenAI and Microsoft should be required to operate on the same playing field in Canada as they will have to in the European Union.
The Canadian government must ensure that creators are fully empowered to exercise their rights and make informed decisions. We believe that, unlike the provisions of the EU AIA, which include an opt-out regime associated with a text and data mining exception, Canadian creators and rights holders should benefit from an opt-in system to license their works.
An AI tool cannot originate artistic work or supplant human creativity. The value of creativity is not captured or understood by the AI processes. Despite claims by the operators of these tools that their use is transformative, the reality is quite different. Generative AI tools do not genuinely transform. Instead, they exploit and launder the creative works they mine. It is imperative that authors receive fair compensation for the use of existing works and for all future uses.
Members of the committee, we thank you for your time. We would be pleased to respond to your questions.
Thank you for this opportunity to discuss Bill C-27 with you from the perspective of Canada's major music labels.
Creating the rules of engagement for AI comes at an important time for the music industry, both here at home and abroad. I want to say, off the top, that our industry is already making use of positive elements of AI as a tool to help artists make more intriguing and interesting music, and using it, again as a tool, to help connect artists, including Canadian artists, with fans all around the world.
Those aspects are, of course, not the central domain of Bill C-27 or the reasons why there needs to be further regulation. I will dedicate the rest of my time to telling you where you can help us most.
As we saw with illegal downloading in the previous generation, the use and ownership of music is a valuable canary in the coal mine. Since then, we have learned the importance of regulating technology for its practical and common use rather than building exceptions into our laws and economic frameworks for corner cases. We've learned that the value of music and other forms of creative expression cannot be sacrificed to the drumbeat of technological revolution, and we've learned that quality, safe and licensed music is as popular with music fans as it is with the artists who are paid when their music is played.
That is why Music Canada is supportive of the efforts made in Bill C-27 regarding the regulation of generative AI.
There are three places where we would encourage you to go even further.
The first involves the need for AI developers to maintain and make available records of the material that was ingested and used for training. Much of the economic framework for the industries that will be affected by the further flourishing of AI requires that everyone understand what the AI is trained on. In order to truly understand that, developers must keep these records.
You will hear from the most excited proponents of unharnessed technology that this request is somewhere between missing the point and being impossible. I ask that the committee think about it in this way: If AI has the potential to cure diseases, design new and better cities for the future, and make travel plans for busy MPs a little more doable, then surely it can generate a spreadsheet or write a bibliography.
The second place the bill can go further is in requiring the labelling of solely AI-generated images and videos, especially in cases where they impersonate an individual. Right now, today, we are standing at the edge of the uncanny valley with AI. Once you learn what to look for, you can understand that the image of the pope in the white puffy jacket is not a photo of the pope, but this technology will never be worse than it is today. Every day it is getting better and, in many ways, more dangerous when it comes to the powerful potential for deception and misinformation. Requiring labelling is an important step towards addressing this.
The third is with respect to the need to address deepfakes and voice clones as a threat and to prepare our legal system so that we can all agree that the production of deepfakes and voice clones without the consent of the cloned person is wrong.
As elected members of Parliament, you know that it takes a lifetime to build the reputation that brings you to this House of Commons. You also know that it takes just one moment for that to come crashing down. Increasingly, this is a fact that people across all professions, livelihoods and ages are coming to grips with in the face of the proliferation of deepfakes and the ease with which they can be produced.
Abacus Data has found that exposure to deepfakes is common and that Canadians are worried about the risks. One out of every two Canadians has mistaken a deepfake for a real video. It's worse for younger Canadians, because 77% have been deceived and 15% say that it happens all the time. Canadians are worried about the effect of deepfakes on artists, political leaders and business leaders, but 79% of Canadians worry about it for themselves too. Almost unanimously, 93%, Canadians agree that there should be a right to prevent these impersonations.
Now is the time to strengthen Bill C-27 and all of our laws to ensure that antiquated analog laws, once designed to protect celebrities' images from being used without their consent in magazine ads, are prepared for the digital realities everyone faces today.
Now, some will ask: What about free speech? When it comes to deepfakes, the answer is simple: Putting your words in my mouth is not free speech.
What about parody? Deepfakes aren't parody. They don't mimic with deliberate exaggeration for comic effect. They are done to deceive, misinform and steal one person's character for the advantage of another. We should make clear that in 2024, in a digital setting, that is illegal.
I thank you for your time and I look forward to your questions.
[English]
For other witnesses who want to contribute to that, I think what we're talking about, then, is sourcing data. It seems, from all testimony, that everyone is in agreement that we want to see that.
Now, the question is whether that's in the Copyright Act, with other consultations now, or in AIDA itself. I understand that it's going to be in AIDA. There are certain parts that point to the Copyright Act, but that's what we want to see.
Do any witnesses want to talk about that or comment on that concern?
:
Mr. Chair, I thank the member for the question.
Look, I think that ultimately it's very important for us at this stage to remember that the technology that can amuse us is the same that can terrify us. That site in particular is of interest, and it can make us laugh.
We may one day be in a position where those things can be licensed, but when those things are stolen, basically, the information and the data needed in order to create that is stripped from the Internet and copyright is infringed. All artists who are reproduced in that way have had it done without their consent and without compensation, and it is of great concern to us.
I think it is very important, as we reach out in this sort of first try, that we keep some of those core principles in mind. We know that we wouldn't accept that in any other fashion. The fact that it's done rapidly in large quantities doesn't make it any more legal.
:
Thank you for that question.
Our members are actors. They are performers. They are in film and TV. They are in video games. They are very much subject to the deepfakes and abuses that are taking place. What we want to bring to light here is that when it comes to actors and performers, we are on the outside looking in at the protections of copyright. We take a step back and say that the first problem for actors is that they don't have moral rights under the Copyright Act to protect them to begin with. Musicians got moral rights when Napster came in a couple of decades ago and started stealing their music. Today we have the same situation with deepfakes stealing the image of performers. We don't have moral rights in the Copyright Act. We say that needs to be taken care of right away.
When we look at Bill C-27, we also see that under the definition of harm, you have “physical or psychological harm”. I would suggest that if we have to prove psychological harm, we'll have to get the DSM out, put it down on a table in a court of law, and explain the condition that was produced. That can't be the bar for a performer. Then there's “damage to an individual’s property”. As I just told you, we don't have copyright protection. Now we go to the third one, “economic loss to an individual”. What we have to understand is that performers are precarious workers. Every day they audition for the next job. It might be one day on set in a series or a film. Every day they have to look for that job. How do they prove that they didn't get that role in Law & Order? How do they prove that as an economic loss?
We need damage to an individual's reputation added to that definition. Eleanor Noble makes a living based on what you see here. Her name, her image and her likeness she safeguards in everything she does, whether business or personal, because she knows that her next gig relies on it, and yet she is subject to the deepfakes that are happening out there.
Our data is easily captured now. Everything is streamed. Everything is on your phone. It's on your computer. It's readily available to be grabbed and used or misused. We really need this committee to take a look at the impact for performers in this country.
:
Thank you to all of you for being here.
I really appreciate your raising your voices in this important conversation. You all represent incredibly important players and members who are across our creative industries and whom I think we're all deeply concerned about when it comes to AI and the harms it can cause to individuals who are earning their living and who, in many cases, I think, develop that reputation over many years and with lots of hard work. I empathize with all of your positions. I've read a lot of your submissions and materials in advance. I really appreciate your being here. Let me just start with that.
One thing I keep hearing is conversation about intersections with copyright. That's fair enough. I get that there are intersections of Bill C-27 and copyright, although we know that Bill C-27 doesn't deal with copyright. The Government of Canada is doing consultations and round tables. They have done seven round tables already.
I want to start by asking each one of you—maybe one representative from each group—if you have been consulted and are participating in the copyright consultation process. The Government of Canada is looking at whether this conversation merits amendments to the Copyright Act as a separate process, but not involved in the scope of this bill.
Ms. Noble or Ms. Kelly, has ACTRA been involved in that consultation?
Thanks to our witnesses here today.
Last week was really interesting. We had Google, Microsoft, Amazon and Meta here. I, quite frankly, was shocked by the fact that we had a panel in front of us that had been fined in the multi-billions of dollars across the world, yet we haven't had any of the same kind of oversight here. They also have the distribution rights to many of the works you actually perform here.
Since that time, in fact, Microsoft, which was here, has been challenged in its takeover of Activision, which affects many of you and the people you represent. They have now identified that they're going to lay off a whole bunch of people at Activision, when they previously said they wouldn't, so the U.S. is taking stronger steps there.
I would just like to go across the panel right now because we have to decide on this bill, which basically moves a lot of stuff to regulation. At best, it will be implemented in probably three years' time, or we can rework it across the board in terms of starting almost from the beginning. That's also because the government is unwilling to separate the Privacy Act aspects of this, where I think there's quite a lot of common ground, from the AI stuff.
Maybe we'll start with ACTRA here and go across.
Should we start over, or should we try to continue to work? I'm on the fence on this. Quite frankly, I was really disappointed with last Wednesday's.... I've never seen a panel, in all my years here, where we literally had companies, representing influence over so many Canadians, that were fined and paid those fines—and lawsuits—including to other governments across the world, for billions of dollars. We've never had a panel like that, and yet they walked in and walked out of the room as if we were nothing at all. They sent in government relations people, including people who were actually bought from government teams of the past.
Do we go ahead with a process that exposes us to potential regulation that's devolved from Parliament in many respects—to be updated—or do we try to rework things and put Parliament back in the front seat?
:
From my perspective, it's not okay for government to do nothing. Across the globe, governments are struggling with this issue, and we do appreciate that this is a very difficult issue that touches on so many different industries, that touches on business as well as people. We appreciate that it's difficult. We also believe it's going to be a patchwork of protections that are going to come in. It can't just be Bill C-27.
Growing up doing some lobbying in my past life, I was always told, “Get what you can now because government's not going to revisit this for another decade.” That can't be what happens here. We have to move forward as best we can, at every opportunity we have, on protections for Canadians, for workers, for our society.
What I would say to you, on Bill C-27, is that we support the intention to ensure that consent is required for biometric information. We understand that it's going to start to protect name, image and likeness, but you're hearing us say, even on Bill C-27, that it doesn't go far enough for performers. We need greater protections within this bill, but you have to move. I would just say to the government: you have to move with speed.
:
Thank you for the question.
My answer is going to be yes, we should be moving forward, but the context is moving so quickly. Bill C-27 was drafted before we had seen the impact of generative AI in the way we see it now. It was only a little over a year ago, with my elected board, that this switched from being in the background to front and centre.
I can echo some of the comments you've heard. In our own surveys of our membership, who work not just as directors but across 50 different job categories, it impacts them in different ways, and it impacts them profoundly.
This is a major concern, so moving quickly but making the improvements, some of which we're happy to be discussing here today, precisely to be able to protect creators, is really important.
Move forward in a thoughtful way, but try to do it quickly. That would be our advice.
:
Mr. Masse, thank you for the question.
Look, I would encourage everyone to continue moving forward with Bill C-27 in its original state, which was a framework for all of these other pieces to hang on. I think that if I were you or any member of this committee, I would go to caucus on Wednesday morning, go to the microphones and say, “I heard really scary things about deepfakes, and we have to do something about that now.”
If it takes longer for Parliament to work through Bill C-27, that's fine, but I think there are steps you could take right now to provide real, meaningful protection for our industry and, in fact, for all Canadians.
:
Yes. Thank you for the opportunity to comment on that.
You can go about it either way. Either you can say that there are no exceptions for AI, that AI is like everything else, and do that in a bill like Bill C-27 that refers back to the Copyright Act, or you can make the change in the Copyright Act itself and say that this is the case.
We didn't create copyright for the printing press. We created copyright for Dickens and the recognition that the work was worth more than what you paid for it right away, and we extended the term of copyright for sound recordings because people were starting to live to the point at which they could hear their song on the radio and not get paid, so we made that change.
If we say that we know they're scraping our stuff, and we know that's a use—it's of value—we can just agree now that that's the case and get out of those sorts of fun academic conversations about “I don't know. Is it a copy?” I know it's a copy. I know they're taking it because our stuff is a thing of value.
:
I think it's going to be a lot of threading together of different things.
Copyright is key. You are hearing us say that as actors. I think you need to have protection on the data you're looking at in Bill C-27. I think it's very important for us to look at how it's scraped and what they're doing with it. We need to have knowledge about where this data is coming from in order for us to even be able to trace bad actors—and good actors who just happen to take it and may not know.
We're looking at things like this: What are you going to do with a worker who has their data taken from them by their employer so they can generate a program—say, a training session, etc.? Why not put something in the Employment Standards Act that protects all workers against having their name, image and likeness taken without consent, control and compensation?
Privacy laws have to be increased so we have those protections.
I'm sure there's more than that. This is going to be a patchwork.
:
I think it's a good illustration of the dichotomy of individual rights versus collective rights. We made the point a little bit earlier about the extent to which we don't think there's a comprehensive view on what we would call collective rights. We're in the business of negotiating collective rights, so we see this all the time.
I'd say, in terms of bad actors, that one of the remedies to bad actors is encouraging good actors. The way you do that is by having order in the marketplace and by having a predictable marketplace where you have music, film and TV widely available in a way that is affordable so that customers can engage and buy. That's how you discourage bad actors.
I'd like to leave you with one thought that I think is relevant here. We talked a lot about the extent to which the problems that arise with.... By the way, one of the direct answers to your question is that one of the ways we can discourage this is to prevent the dilution of value by ensuring that those players who are maybe not bad actors but are surfing off others' existing work to create new things acknowledge it, have the consent and have a model. There's an economic imperative here, too. We're happy to—we make these agreements all the time—sit down and negotiate what a licence agreement may be, and we're seeing more of that happening, so it's obviously possible.
My last comment would be to point out the perverse logic. The same entities that are busy mining copyrighted works to create something new want to disregard the copyright on the input but then seek the protections of copyright on the output. I want to point out that—just to get it on the record—humans are the creative drivers here, not software. Also, coming back to make the connection with an orderly, predictable marketplace, it's problematic to have material that you have not licensed feed into something that isn't made by a human.
I guess the question is this: What becomes of that in terms of...? When I think about the work that our members do, I see that it's millions of dollars of investment in creating film, television and digital media. There's a lot at stake, and investors are going to want to know that they have a path to be able to exploit those works and generate revenue.
I'm sorry for the long answer, but there were many parts to the question.
:
Thank you very much, Mr. Chair.
Ms. Desrochers, witnesses have appeared before our committee and told us we should perhaps have a federal government registry to increase the security of the environment in which all kinds of artificial intelligence models are deployed. When they have a high-impact model, companies should hand over its code and get a risk mitigation plan.
Getting back to cultural diversity, what's interesting is that the representatives of the Googles, Apples, Facebooks and Amazons of the world who have testified here defined high-impact and high-risk models as those involving people's health, safety and, I believe, integrity. Do you think cultural diversity should be included in this definition? If so, how could it be operationalized?
:
That's a good question.
First, even if all our works were processed by machines, they would still constitute a minority of all the information those machines process. Consequently, I don't think that would be enough to protect the diversity of our cultural expressions or to adequately reflect our culture in those models.
Many people are now examining the issue of minority languages and cultures. All kinds of projects are being developed to determine how AI can help propel those minority cultures or to ensure that they're protected.
Lastly, we can consider putting innovative solutions in place to ensure that our culture continues to occupy its place in an environment where AI has taken hold, while retaining as much control as possible over our data and stories. All kinds of proposals are currently circulating.
It's acknowledged that the development of AI reproduces the dynamics of domination and hegemony that we already see in the environment. Consequently, we shouldn't sell our available data cheaply, without consent or in conditions we don't control, and hope that Quebec culture is suddenly better represented in AI systems.
:
I was at a couple of conferences in the United States this summer as part of my Canada-U.S. work. Even some of the companies were talking about how they're trying to fix the ethnic and cultural biases in the data actually going into artificial intelligence and in how they're building their models. They admitted that there are major deficiencies.
I guess what you're saying is that the information that's now collected on the artist could then be replicated and used in biased representations across multimedia platforms for generations, and the person could still basically be walking around there.
It's similar to what you said, Mr. Rogers, with regard to the artist. I thought that was really interesting, because you're right. I was here for the copyright review. Part of it was that they could literally hear themselves, because they're living longer. That can also be a legacy of the person.
Just quickly, I know that we all sign contracts sometimes where we give away our privacy and it's all mishmash and stuff like that. Is it the same in the industry? Do artists have to figure out what they're giving up with these long forms and everything else at the last moment? Is that kind of vulnerability out there?
:
That's a good question.
I would like to give you a little insight into the life of performers. Number one, they know very well that if they are difficult, or perceived as difficult, on the set, that will get around and they won't get another job. Performers show up wanting to please the director, the producer and the people on the set. Far too often they show up at 6 a.m. for hair and makeup and are handed a thick contract and told, “Just sign this, or else.” They're not lawyers. They don't have a lawyer with them, but they know the reality: if they don't sign, giving away whatever it is that contract asks them to give away, they're not going to get that job, and maybe not another job.
They're precarious workers who really have to be concerned about their next job. They can't be the ones left, on their own, to uphold the rights they should have in this society. You've heard about the struggling performer or struggling actor who has to have a second job, often in a bar or a restaurant. That's the truth. They can't pay the rent on the income they make in the job they love, and then they have to face the reality of being perceived as not easy to work with.
They sign these contracts, and they don't know what they're giving away.
:
That's not what I said.
I'm glad you asked me the question, and I thank you for it.
Since publishing is a global industry, we have frequent discussions with international partners. However, our foreign counterparts are at times surprised to see that Canadian copyright legislation lags behind the rest of the world in all sectors.
The purpose of the European directive is to protect cultural expressions, which I believe is one of the objectives of Europe's artificial intelligence legislation and also a boon to the francophonie and to all languages. However, it's even more important, in an AI context, to have good public policies to support minority cultural industries.
English is obviously a dominant language that travels more easily than others, but that's one of the challenges for Canada's anglophone market because the large American market just next door competes with it.
As for the francophone book publishing industry, Quebec's public policies have truly promoted its development, unlike other cultural industries, and the numbers are there to show it. A 50% market share, a very good number, has been achieved as a result of Quebec's and Canada's public policies, which have been foundational for the development of the book publishing industry.
However, those policies must clearly be updated and modernized. The bill before us is an opportunity to help Canadian culture continue to flourish.
:
It's great to have a bit more time to go back to my line of questioning.
My understanding from reading AIDA with the amendments that have been proposed is that it requires organizations building general-purpose systems with the ability to generate output to make their best effort to ensure that the output of those systems can be detected easily or with the aid of free software.
That's one. I think that's a step in the right direction. I'm going to ask you in a second, but I want to cover a couple of other things.
It also significantly strengthens the enforcement framework for privacy and requires express and meaningful consent when sensitive personal information is being collected, used or disclosed. That, to me, covers biometric information, which I think would apply to all of your actors, creators, performers and directors, etc. Perhaps there are some exceptions.
It also requires the companies that are creating the AI systems to keep records related to the creation and operation of the systems, which may suggest they have to keep records of how they're training their systems.
I understand that we could go deeper there, and some of you would want that, but those seem to be three significant steps to create greater transparency.
I want to go to ACTRA first. It seems to me that these are really positive steps. Would you not agree that those are very positive steps that have been added to the bill?
:
I believe the minister's amendments are a step in the right direction on those pieces. I think that, as we've discussed, they can go further.
I am desperate, though, to speak to the idea that because the numbers are big or the systems are complicated, they shouldn't be regulated, or there shouldn't be a need for this.
You have access to almost every song ever recorded on your phone right now in a licensed, legal way because there has been an arrangement between the rights holders and the platforms. That's awesome. That was, at one time, described as not doable. People sat here and told parliamentarians, “Do you really want me to go and track down the rights of every song rights holder? Don't you know there's a recording right and a written right? That will take forever.” Now we have a large, flourishing, legal, licensed music process across multiple platforms.
Therefore, this is doable. I beg parliamentarians not to be led down this path of “It's too complicated for you to understand.” You must reject that. We wouldn't accept that in nuclear regulation or bank regulation. We can't allow it in the stealing of arts and culture.
I have a point of clarification.
Some stakeholders, including many of you today, have raised concerns that the artificial intelligence and data act does not adequately protect the copyright of Canadian creators. That's been well established today.
However, Mark Schaan, senior assistant deputy minister of the department, explained to the committee during his appearance in October that the most effective aspects for addressing the copyright concerns are in the Copyright Act, and that the government has announced a consultation regarding the connection between artificial intelligence and copyright. Some of that has been covered.
Just so I get it on the record and the department hears it very clearly, why do you think matters related to copyright must be dealt with explicitly in AIDA rather than in the Copyright Act? Does anyone want to comment on that? Could amendments to the Copyright Act be made instead of, or in addition to, explicit copyright provisions in AIDA? If so, which ones?
If anyone wants to comment on that point of clarification, it would be very helpful.
Mr. Rogers.
:
Thank you. That's very helpful.
Ms. Noble, your testimony in the very beginning struck a chord with me.
I come from the Fraser Valley and I represent the Fraser Valley and the Fraser Canyon. Probably every Hallmark movie in North America has touched on my riding, the number one riding in Canada, Mission—Matsqui—Fraser Canyon. Literally every week last year, I'd drive by downtown Abbotsford or Mission and see movies being made. Then the strike in the United States happened and the industry shut down. In fact, both neighbours on either side of my house work in the film industry and didn't get a paycheque for almost a year. Those families were very hurt. Those are good, high-paying jobs.
First off, what are actors and writers saying with respect to equivalent legislation or equivalent problems in the United States? Where do we need to find interoperability with American laws, specifically for English-language programming, to make sure our writers, entertainers and performers are not disadvantaged in any way?
I wish everyone a happy Monday, and welcome to this committee.
From the testimony of each of you, it was easily garnered that the impact of AI is nothing less than what happened when the printing press was introduced a few hundred years ago. I say that with much historical thought on that front. What happened in the Industrial Revolution was that we were able to put a train on train tracks across the world. There is much emphasis on the opportunities for artificial intelligence in your field and your sector; however, there is also some trepidation you folks have within the AI space or within that technology.
Eleanor and Marie, I'll start off with you. Is the impact of AI greater on the copyright side or on the generative AI side, where you may not need the individuals at all? I want to get that clarification, because we do have a copyright consultation going on right now, and part 3 of the bill, AIDA, does not pertain to copyright. I want to get your view on this. If you had to split the impact between the two in percentages, what would it be?
:
Thank you for that question. Thank you also for the reference to the impact of the two strikes last year on our members, particularly in British Columbia.
I would say that over the past year there's been a high level of anxiety. Our members do many different functions, starting with directors but right across the spectrum: picture and sound editing, location managers, production accountants, production coordinators, designers and so on.
In asking members questions around their use of AI, their feelings about where this is going and how it's going to be impacting their job, we hear a different story, naturally. From the designers, we're hearing a very high level of apprehension. Designers are the people who create the world that you see onscreen, so they're responsible for the artwork that's on the wall in the person's home where the character is. Editors are quite concerned about the impact, as are, obviously, directors as well as authors.
Across the spectrum, I would say that many of the members we represent see AI as being transformative. I think that meets the definition, in my mind, of something that will have a high impact, both for Canadians and for our culture, and on the way productions are made.
I have one really quick comment. We're used to innovation. We've been digital for 30 years. We don't use film anymore to make films, so we have been early adopters and eager adopters of new technologies all along the way. You may be right that this is equivalent to the invention of the printing press, but we've had the experience of the introduction of a lot of new technologies that are now incorporated into the work that our members do day to day.
AI, in a nutshell, is seen as something different. It is more significant and more transformative.
I hope that answers your question.
I'd like to comment on the transparency issue. One of my colleagues, Mr. Turnbull, discussed this. He said it might be complicated to determine the identity of works that have been used among billions of data points. However, my impression is that an AI system capable of reading 100 million books a day is capable of searching from a list. You'd have to check that.
That being said, some intervenors have told us that Bill C-27 won't get the job done. Many representatives of the web giants told us so, almost implying that we should reject it, start over from scratch, modify all kinds of other acts and work on it for I don't know how many years. We have that option, but there's also the option of moving ahead, continuing to amend Bill C-27 and doing the best we can. Then there's the option of waiting and imitating Europe, since Canada is a minor player after all.
However, there's another solution: we could add a provision requiring periodic updates to the act, say every three to five years. That would force Parliament to review the act completely and would give it the opportunity to align the act periodically with the legislation of other countries so that Canada remains competitive, while enabling it to participate in the international review process.
Ms. Hénault, what do you think of that kind of provision?
My first intervention is about the challenge of what we do next, because I think what you have demonstrated today is the weakness of the argument that we will consult you on Bill C-27, fix copyright sometime later and fix everything else somehow after we pass Bill C-27. That is not sufficient for the NDP. It's clear to us that you can do both of those things. Alternatively, we either send this to regulatory oblivion—that's really what happens—or dismantle what we have here.
I'm looking at an alternative where we view it through the lens of almost like national security. Perhaps we even have a standing committee of Parliament and the Senate that looks at this over all the different jurisdictions, because copyright is proving that it's just outside this particular bill in terms of the technicality of it, but the reality is that it encompasses everything you have been saying and doing here in a much more wholesome way than in many other industries.
I have one quick question to go across the table here about an AI commissioner. Should the commissioner be independent and able to impose fines for the abuse of artificial intelligence, if that is part of the law?
Maybe we can start with ACTRA and go across.