:
Madam Speaker, I am happy to start this week by speaking to Bill . It is quite an extensive bill at over 140 pages in length. It would amend several acts, and three of them in particular are the most consequential, as it is an act to enact the consumer privacy protection act, the personal information and data protection tribunal act and the artificial intelligence and data act and to make consequential and related amendments to other acts.
I should start by saying that this is really three pieces of legislation that have been bundled into one. As New Democrats, we have called for a separate vote on the third and final part of this act.
The first two parts of the act, concerning the consumer privacy protection act and the personal information and data protection tribunal act, do have enough common themes running through them to be put together into one piece of legislation. I still think they would have been better as two separate pieces of legislation, because one of them is brand new and the first one, the consumer privacy protection act, is the former Bill , which was highly controversial in the previous Parliament.
When we had an unnecessary election called by the , that bill died, along with all of Parliament's unfinished work, despite extensive lobbying and consultation, particularly through the ethics committee at that time. This has now been bundled with other legislation to go through the industry committee, which is fine.
The personal information and data protection tribunal act is a new component of this legislation. I have some concerns about that element of it, but it does have a common theme, which is worthwhile, and at least it has the potential to be put together and bundled. Although, again, it is extensive, it is a bundling that we can accept.
We have called for a Speaker's ruling with regard to the artificial intelligence and data act, as it is brand new legislation as well, but it does not have the same connections to the previous two pieces that could be argued to justify bundling them together. We want a separate vote on this part, because the legislation would be studied together at committee.
There will be a high degree of interest in this legislation, as there was for Bill in the past. The new bill changes position from Bill C-11 significantly, and I expect that this in itself will garner a lot of chatter, review and interest from a number of organizations, many of which we have already heard from.
The other part, the tribunal, is another important aspect, because it is a divergence from our traditional way of enforcement and creates another bureaucratic arm. Again, I would like to see more on this, and I am open to considering the idea, but it is certainly different from our traditional private right of action for settling disputes about data breaches and other types of corporate malfeasance, and from the types of laws that are necessary to bring about compliance.
This goes to the heart of where a political party resides in its expectations of companies and their use of data, information and algorithms. New Democrats fall very much in line with something I have tabled before, several years ago, which is a digital bill of rights, so that one's rights online are consistent with one's rights in the physical world, where one expects to be properly treated, whether in a physical or a digital setting. That includes one's right to privacy, the right to the expectation of proper behaviour toward oneself and the right not to be abused. It also includes significant penalties for those who commit those abuses, especially in the corporate world.
Where this legislation really becomes highly complicated is the emergence of artificial intelligence, which has taken place over the last decade and will be significantly ramped up in the years to come. That is why the European Union and others have advanced on this, as well as the United States.
Our concern is that this bill tries to split both worlds. We all know that Google and the other web giants have conducted significant lobbying efforts over the last number of years. In fact, they have tripled their efforts since this administration came into place and have had a direct line of correspondence about their lobbying, which is fine to some degree, but the expectation among people that it would be balanced does not seem to be met.
I want to bring into the discussion the impact on people before I get into the technical aspects of the bill, as well as the data breaches that remind us of the need for protection for our citizens and for companies as well. One thing that is often forgotten is that SMEs and others can be compromised quite significantly by this, so protecting people individually is just as important for our economy, especially with the emergence of new industries. If behaviours are hampered, manipulated or streamed, they can become significant issues.
I want to remind people that some of the data breaches we have had with Yahoo, Marriott, the Desjardins group and Facebook, among others, have demonstrated significant differences between the regulatory systems of Canada and the United States and how they treat victims. A good example is the 2019 settlement in the U.S. over the Equifax data breach, in which Equifax agreed with U.S. authorities to pay $700 million to settle lawsuits over the breach, including $425 million in monetary relief to consumers. We have not had the same type of treatment here in Canada.
This is similar to the work I have done in the past with the auto industry and the fact that our Competition Bureau and our reimbursement systems are not up to date. We have been treated basically as a colony by many of the industries when it comes to consumer and retail accountability.
We can look at the example of Toyota and the data software issue, where the car pedal was blamed for cars going out of control. It turned out this was not the case; it was actually a data issue. In the U.S., this resulted in hundreds of millions of dollars of investment in safety procedures. We received zero for that. Consumers there also received better treatment, with their vehicles towed back to dealerships to be fixed. In Canada, consumers did not receive any of that.
The same could be said of Volkswagen, another situation, which involved emissions. Not only did we not receive compensation similar to that of the United States, but we actually imported a lot of the used Volkswagen vehicles from Europe, of our own accord and on our own time frame, at the very time those vehicles were being sunsetted in those countries because of their emissions.
In the case of Facebook, the U.S. Federal Trade Commission was able to impose a $5-billion fine for the company's violations, while the Privacy Commissioner's office was forced to take the company to Federal Court here in Canada. One of the things I would like to point out is that our Privacy Commissioner has stood up for the needs of Canadians, and one of the concerns with this bill is the potential erosion of the Privacy Commissioner's capabilities in dealing with these matters.
The Privacy Commissioner has made some significant points on how to amend the bill to actually balance it, but they have not all been taken into account. One of the things we will be watching for is whether the amendments the Privacy Commissioner considers necessary are made.
One of the big distinctions between Canada and the United States, which is to our benefit and to Canada's credit, is the office of the Privacy Commissioner. Where we do not have some of the teeth necessary for dealing with these companies, we do have the independent Privacy Commissioner, who is able to investigate and follow through at least with bringing things to a formal process in the legal system. It is very laborious and difficult, but at the same time, it is independent, which is one of the strengths of the system we have.
If the government proceeds, we will see the bill go to committee, which we are agreeing to do. However, we do want to see separate voting. Before I get into more of the bill, I will explain that we want separate voting because we believe this bundling is inappropriate. The artificial intelligence act is the first time we have even dealt with this topic in the House of Commons, and it should be handled differently.
We will be looking for amendments on this. Big corporate data privacy breaches are becoming quite an issue, and some of them get highly complicated to deal with. There have been cases involving cybersecurity and even extortion; the University of Calgary case is one that was well noted, and there have been others.
We need some of these things brought together. The bill does include some important fixes that we have been calling for, such as stronger enforcement of privacy rights, tough new fines and transparency in corporate decisions made by algorithms.
I have pointed out a lot of the concerns that we have about the bill going forward because of its serious nature. However, we are glad this is happening, albeit with the caveat that we feel the bill should be separate legislation. The does deserve credit for bringing the bill forward for debate in the House of Commons.
Bill should have been passed in the last Parliament, but here we are dealing with it again. The new tribunal is the concern that we have. It could actually weaken existing consent rules, and we will study the new tribunal closely.
The tribunal itself is going to be interesting because it would involve an appointment process, and there is always a concern with a government appointment process. There could be complications in setting up the tribunal, such as who gets to sit on it, what their backgrounds and professions are and whether there will be enough support.
One of the things that gives me trouble is that the CRTC, for example, takes so long to make a decision. It is so laborious to go through and it has not always acted, most recently, in the best interest of Canadians when it comes to consumer protection and individual rights. It gives me concern that having another tribunal to act as a referee instead of the court system could delay things.
Some testimony and analysis has already been provided suggesting that the tribunal might end up in lawsuits anyway, so we could potentially be back to square one. The time frames, the funding, the ability to investigate and all these different things are very good issues to look at to find out whether we will have the proper supports for a new measure being brought in.
Adequate government resources for this are also key. At the end of the day, if the tribunal system is not supportive of protecting Canadians' privacy and rights, then it will weaken the entire legislation. That is a big concern, because it would sit outside Parliament. The way some of the amendments are written, much of this could come through regulatory means, with less parliamentary oversight.
Who is going to be on the tribunal? How will it be consistent? How will it be regulated? I would point to the minister providing the CRTC with a mandate letter, which is supposed to emphasize the public policy direction it should take. In my assessment, the CRTC, over the last number of years, has not taken the consumer protection steps that New Democrats would like to see.
When it comes to modernizing this law, we know this will be important to address because there are issues regarding data ownership, which is really at the heart of some of the challenges we face. There is algorithmic abuse, and there are also areas related to compensation, enforcement, data control and a number of other things that are necessary to ensure the protection of people.
We can look at an area where I have done a fair amount of work related to my riding, which is automobile production. There is the production of the car and the value in that, but there will also be data collection. The use of that data can influence not only one's individual behaviour but also that of society, and it is a significant economic resource for some of these companies.
It is one of the reasons I have tabled an update to my bill on the right to repair. The right to repair is a person's ability to have their vehicle fixed at an auto shop of their choice in the aftermarket. The OEMs, the original equipment manufacturers, have at times resisted this. Tesla, for example, is not even part of what is called the voluntary agreement, and we still do not have an update with regard to the use of data and how one actually goes about getting a vehicle fixed.
It also creates issues related to ownership of the vehicle, as well as insurance and liability. These could become highly complicated issues related to the use of data and the rules around it. If these types of things are not clear with regard to the process of rights for people, expectations by those who are using the data, and protection for people, then it could create a real, significant issue, not only for individuals but for our economy.
Therefore, dealing with this issue in the bill is paramount. A lot of this has come about by looking at what the GDPR, the general data protection regulation, did in European law. Europe was one of the first jurisdictions to bring forward this type of measure, and it provides an adequate level of protection, which is one of the standards Europe stands by with regard to the protection of privacy. There have been some over here in North America who have pushed back against the GDPR, and even though that landmark legislation has created a path forward, there is still a need for transparency and for understanding what the monetary penalties for abuse are going to be, which is also very important in terms of what we expect in the legislation.
Erosion of consent rights is one of the things we are worried about in this bill. Under Bill , individuals would have significantly diminished control over the collection, use and disclosure of their personal data, even less than in Bill . The new consent provisions ask the public to place an enormous amount of trust in businesses to keep themselves accountable, as the bill's exceptions to consent allow organizations to conduct many types of activities without any knowledge of the individuals. The flexibility under Bill allows organizations to define the scope not only of legitimate interests but also of what is reasonable, necessary and socially beneficial, thus modelling their practices in a way that maximizes the value derived from the personal information.
What we have there is the actors setting some of the rules. That is one of the things we need to clarify through the discussion that will take place at committee and through the testimony we will hear, because if we let those who use and manage the data decide what consent is and how it is used, we are going to create a system that could really lead to abuse.
There is also the issue, or danger, of de-identification. How artificial intelligence and people are able to scrub much of the data, when they want and how they want, is one of the things we are concerned about, and there is not enough acknowledgement of the risk involved, including for young people. We believe this bill is a bit lopsided toward the business sector at the moment, and we want to propose amendments that would lead to better protection of individual rights and ensure informed consent as to what people want to do with their data and how they want it used for the benefit of themselves and their families, versus people being accidentally or wilfully exposed in ways they have not consented to.
As I wrap up, I just want to say that we have a number of different issues with this bill. Again, we believe there should be a separate vote on the final part of this bill, the third piece of it. It is very ambitious legislation; it is as large as the budget bill, and that should say enough about the amount of content it has. I thank the members who have debated this bill already. It is going to be interesting to get all perspectives. I look forward to the work that comes at committee. It will require extensive consultation with Canadians.
:
Madam Speaker, I will be sharing my time with the member for .
I am pleased to rise in the House today to speak to the , in particular the aspect on the consumer privacy protection act. If I have time, I will also discuss the artificial intelligence and data act.
I am very proud to speak to these two pieces of legislation that introduce a regime that seeks to not only support the technological transformation, but also help Canadians safely navigate this new digital world with confidence. These past few years, Canadians have witnessed these technological shifts take place. They have taken advantage of new technologies like never before. In 2021, more than 72.5% of Canadians used e-commerce services, a trend that is expected to grow to 77.6% by 2025.
According to TECHNATION, a 10% increase in digitalization can create close to a 1% drop in the unemployment rate. What is more, every 1% increase in digitalization can add $8.7 billion to Canada's GDP. In order to take advantage of those major benefits for our economy, we must ensure that consumers continue to have confidence in the digital marketplace.
Technology is clearly an intrinsic part of our lives, and Canadians have growing expectations regarding the digital economy. It is absolutely essential that the Government of Canada be able to meet those expectations.
With this bill, the government is putting forward a regime that gives Canadians the protection they deserve. First, as stated in the preamble of the digital charter implementation act, 2022, Canada recognizes the importance of protecting Canadians' privacy rights. Similarly, the 2022 consumer privacy protection act also provides important protections for Canadians.
That said, our government has listened to the input of various stakeholders, and we have made changes to improve this bill. I was on the committee in the last Parliament, and there was a lot of discussion about the previous bill, Bill . I am very pleased to be able to speak to Bill , so that we can get all that work done in this Parliament.
One of the most important changes we have made is enhancing protection for minors. Some stakeholders felt that the previous legislation did not go far enough to protect children's privacy. I agree. Consequently, the bill was amended to define minors' information as sensitive by default. This means that organizations subject to the law will have to adhere to higher standards of protection for that information. The legislation also provides minors with a more direct route to delete their personal information. This will make it easier for them to manage their online reputation. I think this is a really important change, because we know that young people are very aware and very capable of using all types of digital platforms, but at the same time, we need to make sure that they are able to protect their reputation.
In addition to protections for minors, we also made changes to the concept of de-identification of personal information. According to many stakeholders, the definitions in the old bill were confusing. We recognize that having well-defined terms helps ensure compliance with the act and provides more effective protection of consumers' information. In that regard, I understand that, because we are talking about new technologies and an evolving industry, it is important for all members to share their expertise, since that will help us develop a better piece of legislation.
The difference, then, between anonymous information and de-identified information needs to be clarified because, clearly, if information is de-identified but an organization or company is able to reidentify it, that does not serve the purpose of having anonymous information.
Data-based innovation offers many benefits for Canadians. These changes contribute to appropriate safeguards to prevent unauthorized reidentification of this information, while offering greater flexibility in the use of de-identified information.
The new law also maintains the emphasis on individuals' control over the use of their personal information. That remains a foundation of the law: individuals must be able to fully understand the purpose for which information will be used and consent to that purpose in the most important circumstances.
However, the modern economy must also have flexible tools to accommodate situations that are beneficial but that may not require consent if the organization respects certain limits and takes steps to protect individuals.
The approach advocated here continues to be based on the concept of individual control, but proposes a new exception to consent to resolve these gaps as a tool for safeguarding privacy. The new provisions propose a general exception to cover situations in which organizations could use personal information without obtaining consent, provided that they can justify their legitimate interest in its use for circumstances in which the individual expects the information to be used.
In addition, to prevent abuse, the exception is subject to a requirement that the organization mitigate the risk. For example, digital mapping applications that take photos of every street and that we use to view them, particularly to help with navigation, are widely accepted as being beneficial. However, obtaining individual consent from every resident of the city is impossible.
I believe that everyone in the House will agree that it is hard to imagine how we managed before we had access to those navigation applications. Last evening, I had a visit with a family member in Ottawa and was very happy to have my mapping application to find my destination.
The presence of an exception, combined with a mitigation requirement, therefore allows individuals to take advantage of a beneficial service while safeguarding personal information. The example shows another key aspect for building trust and transparency. Digital mapping technology presents a certain level of transparency. The vehicles equipped with cameras can be seen on our streets and the results can also be seen posted and available online.
However, there are some technologies or aspects thereof that are more difficult to see and understand. That is why the bill continues granting individuals the right to ask organizations for an explanation regarding any prediction, recommendation or decision made in their regard by an automated decision-making system.
What is more, these explanations must be provided in plain language that the individual can understand. These provisions also support the proposed new artificial intelligence act. However, I do not think that I have time to get into that, so I will end there.
:
Madam Speaker and hon. colleagues, I rise today to speak about the digital charter implementation act, 2022, also known as Bill .
I thank the member for for sharing her time with me today.
It is an important discussion that is happening among Canadians about what our digital environment looks like. As we know, over the past few years, we have witnessed the constant evolution of our digital environment. Canadians have been successfully navigating through this changing environment, but they have also made it clear to us that they want better protection of their privacy. They want to be able to benefit from the latest emerging technologies with the confidence that they can be used safely. Canadians also believe that organizations need to be fully accountable for how they manage personal information and how they go about developing powerful technologies, such as artificial intelligence, or AI.
From the beginning of our consultations on digital and data, stakeholders have stressed the importance of maintaining flexibility to innovate responsibly and maintain access to markets at home and abroad. I am proud to say that the digital charter implementation act, 2022, which would enact the consumer privacy protection act, or CPPA, and the artificial intelligence and data act, or AIDA, would do just that.
The CPPA represents a complete transformation of Canada's private sector privacy regime, the Personal Information Protection and Electronic Documents Act, or PIPEDA, which came into force in 2001, some 20 years ago. The CPPA would introduce significant changes to better protect Canadians' personal information, including strong financial consequences for those who seek to benefit from circumventing their legal obligations. This new framework would also ensure that all Canadians enjoy the same privacy protections as individuals have in other countries.
The AIDA, for its part, is being proposed to build confidence in a key part of the data-driven economy. This part of the bill would introduce common standards for responsible design, development and deployment of AI systems. It would also provide businesses with much-needed guardrails for AI innovation and would ensure that Canadians can trust the AI systems that underpin the data economy.
PIPEDA was passed at the start of the century when other countries and some provinces were moving forward with privacy laws governing the private sector. Recognizing the potential for a patchwork of provincial privacy laws to emerge and the need to align internationally, Canada put in place PIPEDA as a national privacy standard. It drew on best practices to provide robust privacy protections for increased consumer confidence and a consistent and flexible regulatory environment for businesses that allowed for legitimate use of personal information.
The key element for alignment was the recognition of provincial private sector privacy laws as substantially similar. This means that, where such a law is given that designation, PIPEDA does not apply to an organization's activities within that province. PIPEDA continues, however, to apply to the federally regulated sector in that province and to any personal information collected, used or disclosed in the course of commercial activities across borders. This has provided a stable regulatory environment and flexibility for provinces, and it has supported Canada's trade interests well for many years.
Today, history is repeating itself, but the stakes are much higher. The role of the digital economy is far more central to our lives than it was 20 years ago. To harness all that the modern digital world has to offer, we clearly need to modernize our federal private sector privacy law. The provinces are moving in that direction and, again, the risk of fragmentation looms.
Quebec has amended its private sector privacy law, and B.C. and Alberta are examining their private sector privacy laws as well. Ontario too is considering introducing a new private sector privacy law. Therefore, the federal government must act now to ensure that all Canadians benefit from a substantially equivalent degree of protection and facilitate compliance for organizations that do business across the country.
Like PIPEDA, the CPPA is grounded in the federal trade and commerce powers. It builds on the best practices developed internationally and by Canadian provinces, and it foregrounds the importance of the ease of doing business across boundaries. The CPPA replicates the approach under PIPEDA, and it updates the mechanism in regulations for recognizing provincial laws as substantially similar. The regulations will set out the criteria and process for such recognition and will continue to provide the flexibility that has been important to PIPEDA's success.
CPPA, like its predecessor, would also maintain the Privacy Commissioner's ability to collaborate and co-operate with his or her provincial counterparts. This is an important tool to ensure consistency, guidance and enforcement, and one that has enabled our commissioners to lead the world in privacy collaboration and co-operation.
Canada also needs to move proactively to regulate in the AI space, given that the operation of these systems transcends national and provincial borders in the digital environment. AIDA would create a common standard that all organizations involved in international and inter-provincial trade and commerce would have to meet. AIDA would place Canada at the forefront of international regulation in the AI space and would provide clear rules across the country. This would spur innovation and build confidence in the safety of AI systems used or developed in Canada.
We live in an interconnected world. Data is constantly flowing across borders. In 2001, the European Commission recognized PIPEDA as providing adequate protection relative to EU law, allowing for the free flow of personal information between Canadian and European businesses.
In 2018, a new EU regulation came into effect that was known as the general data protection regulation. It updated many of the existing requirements and added strong financial penalties for contraventions. The EU is currently reviewing its existing adequacy decisions, including the one that applies to Canada. We expect to hear more on the outcome of this review soon.
The CPPA would make a positive contribution to maintaining Canada's adequacy with the EU privacy regime. It would enable personal data from EU businesses to continue to flow to Canada without additional protections. Beyond the EU, the changes proposed in the CPPA would represent important updates that would bring us in line with other international jurisdictions that have updated their laws. It would ensure interoperability with consistent rules, rights and consequences.
Other jurisdictions internationally are also moving ahead on their AI regulation, and strong action is needed to maintain Canada's leadership position internationally. Interoperability with international partners remains a key priority. The EU in particular has advanced a framework for regulating AI that would set standards for any AI systems being deployed in the EU market.
AIDA proposes a risk-based approach that would ensure interoperability with the EU while keeping in mind that the Canadian context is unique. For example, AIDA would include flexible compliance options to ensure that our many small and medium-sized businesses are not left behind. The proposed AIDA would represent an opportunity for Canada to lead internationally, would ensure market access for Canadian companies and would uphold Canadian values.
The government launched Canada's digital charter in 2019. Its 10 guiding principles offer a foundation on which to build an innovative and inclusive digital and data-driven economy. Ensuring interoperability, a level playing field, strong enforcement and real accountability are clearly reflected in the digital charter implementation act, 2022.
I can assure colleagues that our approach is pragmatic, principled and meets our trading needs. The bill would provide a consistent, coherent framework that Canadians and stakeholders could rely on. With Bill we would continue to encourage trade and investment and to grow an economy that would extend across provincial and international borders alike.
:
Madam Speaker, data is used for good and data is used for evil. Data is money, data is power and data is knowledge. Data can improve our lives. Data can also harm our lives. Data tells the story of our lives, and our personal data flows globally. The amount of data in the world has doubled since 2020 and is expected to triple by 2025, according to Statista (2022).
To understand why we need modern privacy rights in the digital world, it is important to understand that businesses have evolved from providing a specific service, like a social network such as Facebook or Twitter, or a search engine such as those of Google or Microsoft to find things, to using data to gather information on individuals and groups, to manage and deploy people's data, to sell their information to others and to sell them goods and services.
We have evolved from businesses providing these services in our interest to businesses using these services to conduct surveillance on us and make enormous amounts of money from our personal information. As legislators, we must balance the uses of data collection with an individual's right to privacy. It is a delicate balance that Bill aims to address by modernizing our privacy laws.
At the heart of this long overdue revision to our privacy laws must be the rights of the individual. In my view, commercial usage of data under privacy law should be secondary to personal privacy, and should only be focused on how business interests enhance personal needs and how commercial entities protect individual privacy rights. My remarks today will focus on why this legislation falls far short of what individuals, groups and businesses need for a clear legislative framework of data collection and management of personal information in this digital age.
First, Bill is really three bills in one omnibus bill. The first would update privacy law. The second contains a new quasi-judicial body that could duplicate what the Privacy Commissioner does while removing the right to go to the courts. The third is a rushed bolt-on bill on artificial intelligence that does not, in my mind, have much intelligence in it. The Liberal legislation manages to weaken privacy and put up barriers to innovation at the same time.
Bill fails Canadians right up front in its preamble. Despite demands from privacy advocates over the last few years, the government has failed to recognize privacy as a fundamental right in the preamble. The bill states that individuals' personal information should have the “full enjoyment of fundamental rights”. This is clever language that avoids giving personal privacy the recognition that it is a fundamental right or a fundamental human right.
The wording “full enjoyment of fundamental rights” in the preamble needs to be amended from “of fundamental rights” to “as a fundamental right”. Furthermore, leaving this strictly in the preamble reduces if not eliminates any real legal impact. If privacy is a fundamental right, for it to have true force in this bill it needs to be included as well in clause 5, which notes the purpose of the bill.
Why is privacy a fundamental right? Freedom of thought, freedom of speech and the freedom to be left alone are derived from privacy. The legal protections of privacy limit government's intrusion into our lives. In free and democratic societies, we consider these freedoms essential rights. What I think, what I say, what I choose to do, what I am interested in, whom I interact with and where I do that are, in our digital world, data points. To me they are personal information and therefore part of a fundamental right to privacy.
What does this mean? It means privacy rights under law are prioritized over commercial rights. A rights-based approach serves as an effective check on technology's potential dangers while ensuring businesses can function and thrive.
Government officials have told me this cannot be recognized in the bill the way it needs to be to have true meaning under law and force because it would intrude on provincial jurisdiction. I do not agree, and neither does the Privacy Commissioner of Canada. Both levels of government can regulate privacy and do. The federal government's role is to regulate aspects under its control, including the fact that commerce does not follow provincial boundaries and therefore requires federal oversight.
I believe that most Canadians accept and expect their data to be used to enhance their experiences and needs in our modern society. I also believe that for organizations to obtain the data of Canadians, Canadians must first consent to it, and that if these same organizations find new uses of our data, they need to get express consent as well. Canadians want their data safely protected and not used for things they did not give permission for, and if they choose to end a relationship with a service provider, they want their personal data to be destroyed.
I do not believe Canadians want their personal data sold to other entities without their express consent, and how does Bill deal with these expectations of Canadians? I think poorly. The legislation, in the summary section, states that the dual purpose of the bill is to “govern the protection of personal information of individuals while taking into account the need of organizations to collect, use or disclose personal information in the course of commercial activities.” What it would not do is place personal privacy rights above commercial interests.
The bill would require express consent in clause 15, and that is true, but a great deal of the bill goes on to describe the many ways in which consent would not be required and how it would be left up to the discretion of the organization that has collected the data if it needs consent for its usage. The bill is also weak in terms of making sure individuals understand consent when given. For consent to be meaningful, the usages proposed must be understood. The lack of definition and the placement of burden of interpretation on businesses expose those same businesses to legal action and penalties if they get it wrong. This lack of clarity may stifle innovation in Canada as a result. The bill needs to ensure that individuals understand the nature, purpose and consequences of the collection, use and disclosure of the information to which they are consenting.
In addition, the bill would give organizations the right to use information in new ways and would require businesses to obtain updated consent for those uses. That is good and necessary, but the bill would also enable organizations to rely on implied consent under subclause 15(5). When combined with paragraph 18(2)(d), this would give businesses carte blanche to use implied consent rather than express consent.
An organization can decide on its own that the original consent implies consent for a new purpose, and it does not need to seek the individual's views. This is a version of the old negative option marketing that was outlawed in the 1990s. Either someone gives consent, or they do not. There is no such thing as implied consent, in my view, and this needs to be removed from the bill.
Additionally, the bill uses the term "sensitive information", which companies and organizations must interpret in order to protect data, but nowhere in its more than 100 pages does it define what "sensitive information" is. It needs to be defined in the bill to include information revealing racial and ethnic origin, gender identity, sexual orientation and religious and other affiliations, to give just a few examples.
However, that is not the worst of it. Bill would introduce a concept called “legitimate interest”. This is a new rule that would rank an individual's interests and fundamental rights below those of the organization that gathered the information, the exact opposite of what a personal privacy bill should do. To do this, subclause 18(3) would allow an organization or business to use information if it has a legitimate interest in doing so. However, here is where it really gets goofy: To try to reduce businesses using our data under the legitimate interest clause for their own needs over ours, the Liberals have decided to limit the power under paragraph 18(3)(b). This clause could prohibit the business or organization from using our information for the purpose of influencing behaviour.
For more than 20 years, since the invention of loyalty and rewards programs, retailers have used people's data to offer products they might enjoy based on their purchasing patterns. Have members ever bought wine online or in store because it said, “If you like this, you might enjoy this alternative”? Have members ever watched a show on Netflix because it was recommended? Have members ever listened to a song on Spotify because it was recommended based on what else they had listened to? Well, guess what. Paragraph 18(3)(b) could now make this service illegal.
The Liberals cannot get express consent right, and they are allowing companies to use people's data with implied consent or no consent at all. The Liberals are also putting the business use of people's personal data above their privacy rights. That is why it is really the no privacy bill. At the same time, the Liberals are making illegal the good parts of what businesses do in enhancing the customer experience by removing the ability to study purchasing patterns and offering products that we might enjoy because of paragraph 18(3)(b). This bill makes influencing people's decisions illegal.
The said to me and mentioned in the House in his opening speech on the bill, as have other members today, that he is proud to be protecting children from harm in this digital bill. This 100-page legislation has only one clause related to children. Subclause 2(2), under “Definitions”, states that “information of minors is considered to be sensitive”, but the bill does not define “sensitive” nor does it define what a minor is. Officials tell me that the definition of a minor is determined by provincial law, so each province would have different rules, and companies would have to comply with the different rules in every province.
If the protection of children were really a major purpose, this legislation would devote some space to defining both what a minor is and what sensitive information is. During COVID, minors used many online apps and programs to continue their formal education. There were then and still are no protections under law as to what is done with their data. This technology would be a new normal for our education system. The online surveillance of children resulting from the COVID experience is huge and protections are zero, even with this bill.
This bill needs to define in law, not regulation, age-appropriate consent for minors, and comprehensive rules to prevent the collection, manipulation and use of any minor's data. This bill leaves it up to businesses to decide what is sensitive and appropriate for minors. It is a colossal failure on the 's main selling point for this no privacy bill.
The bill is silent on the selling of personal data; it needs provisions on the limits and obligations of data brokers. The bill is silent on the use of facial recognition technology. The bill prohibits using data in a way that produces significant harm, but it defines that harm inadequately. For example, psychological harm caused by a data breach and embarrassment caused by privacy loss are not included. The damages provisions need to be expanded to include moral damages, since most contraventions of privacy do not involve provable, quantifiable damages.
Creating more government bureaucracy and growth is the true legacy of the Liberals in government. This bill is no exception, with the creation of a body to which the Privacy Commissioner's rulings can be appealed. The new appointed body of non-lawyers is called the personal information and data protection tribunal, and it is the second part of the bill. Frankly, these powers, if they really are important, should be given to the Privacy Commissioner to eliminate the bureaucratic middleman. There is no need for this tribunal.
Finally, let us turn to the ill-conceived, poorly structured and ill-defined artificial intelligence part of Bill . It really needs to be removed from this legislation and puts this bill's passage into question. AI is a valid area to legislate, but only with a bill that has a legislative goal. That is why I am hopeful that the Speaker will rule in favour of the NDP's point of order, reiterated by our , which would ensure that part 3 of the bill is voted on separately from part 1 and part 2.
Essentially, this part of Bill would drive all work on AI out of Canada to countries with clearer government legislation. It tells me the government has not done its homework, does not really know what AI is or will become, and has no idea how it will impact people in our country.
The bill asks parliamentarians to pass a law that defines no goals or oversight and would give all future law-making power to the minister through regulation, not even to the Governor in Council but to the minister. The minister can make law, investigate violations, determine guilt and impose penalties without ever going to Parliament, cabinet or any third party.
It is a massive overreach and is anti-democratic in an area critical to Canada's innovation agenda. Promises of consultation in the process of crafting regulations are too little, too late. It puts too much power in the hands of unelected officials and the minister.
The definition in the bill of what AI is, and therefore of what it wants total regulatory power over, is a system that autonomously processes data related to human activities, using a genetic algorithm, a neural network, machine learning or another technique, to make recommendations or predictions. If we think this is futuristic, it is not. It is already happening in warfare to determine and execute bombings.
Without parliamentary oversight, the bill introduces the concept of “high-impact systems”. It does not define what that is, but it will be defined in regulation and managed in regulation. No regulatory power should ever be given to the minister or the Governor in Council for anything that is not defined in law.
The only things the bill defines are the unprecedented power to rule over this entire industry and the fines for those who breach the unwritten regulations. The financial and jail penalties in the statute, which extend down to developers and university researchers for undefined breaches of the law, are massive.
Unless this portion of the bill is separated when members vote, this AI section is reason alone that the bill should be defeated. AI is a significant need, but it needs a proper legislative framework, one that is actually developed with consultation.
I urge all members to read the bill carefully. Current privacy laws need amendment, but the current law is preferable to this ill-defined proposal. The AI bill would drive innovation and business out of Canada's economy, making us less competitive.
It is hard to believe anyone could get this legislation so wrong, especially since this is the second time the Liberals have proposed updating our privacy laws. Without splitting the bill, without having separate votes and without considerable amendments in committee in the first two parts, the bill should be defeated.
I urge all members to consider this seriously in their deliberations as we go on to the many speeches we will hear. While updating our personal privacy law is a critical point, the bill, in its current state, does not do it, and it gives equal, if not greater, rights to businesses and organizations than it does to individuals.
:
Madam Speaker, it is an honour today to rise to speak to Bill , the digital charter implementation act.
I think it is important to reflect on how long it has been since we last had an update to legislation regarding the privacy laws that exist around data. The last time was over 20 years ago. Twenty years might not seem like a long time, but when we think about it, 20 years ago Facebook was probably just a program Mark Zuckerberg was working on in his dorm room.
If we think of iPhones, they were pretty much non-existent 20 years ago. Smart phones were out, but they certainly did not have anywhere near the capabilities they do today. So many other technologies we have come to rely on now have been getting smarter over the years. They are acting in different manners and are able to do the work they do because of the data being collected from individual users.
Another great example would be Google. Twenty years ago it was nothing more than literally a search engine. One had to type into the Google form what one was looking for. Sometimes one had to put weird characters or a plus symbol between words in the search terms. It literally was just a table of contents accessing information for people. However, now it is so much more than that. How many of us have, at some point, said to somebody that we would love to get a new air fryer, and then suddenly, the next day or later that day, we see in Google, on Facebook, or whatever it might be, advertisements for air fryers that keep popping up. I am sure that sometimes it is a coincidence, but I know in my experience it seems it happens way too often to be a coincidence.
These are the results of new technologies that are coming along, and in particular AI, that are able to work algorithms and build new ones based on the information being fed into the system. Of course the more information that gets fed in, the smarter the technologies get and the more they are looking to feed off new data that can give them even further precision with respect to advertising and targeting tools at people.
This is not just about selling advertising. AI can also lead to incredible advancements in technology that we otherwise would not have been able to get to, such as advancements in health and the automotive industry. If we think of our vehicles, the big thing now in new cars is the lane-assist feature, which uses technology such as lidar to read signals in the road.
There is technology that, when we enter our passwords to confirm we are human beings, sometimes requires us to pick different things from pictures. When we do that, we are feeding information back that helps those images be properly classified. We are not just confirming that we are human beings; an incredible amount of data is being used to refine various formulas and equations based on the things we do.
When we think of things like intelligent and autonomous vehicles, which basically drive themselves, 20 years ago would we ever have thought a car could actually drive itself? We are pretty much halfway there. We are at a point where vehicles are able to see and identify roads and know where they need to be, what the hazards are, and what the possible threats are that exist with respect to that drive.
What is more important is that, when I get into my vehicle, drive it around and engage with other vehicles, it is analyzing all of this data and sending that information back to help develop that AI system for intelligent vehicles to make it even better and more predictive. It is not just the data that goes into the AI, but also the data that it can generate and then further feed to the algorithms to make it even better.
It is very obvious that things have changed quite a bit in 20 years. We are nowhere near where we were 20 years ago. We are so much further ahead, but we have to be conscious of what is happening to that data we are submitting. Sometimes, as I mentioned in a previous question, it can be data that is submitted anonymously for the purposes of being used to help algorithms around lidar and self-driving vehicles, for example. At other times it can be data that can be used for commercial, marketing and advertising purposes.
I think of my children. My six-year-old, who is in grade one, is developing his reading quite quickly. Two years ago, even at the age of four, when he would be playing a video game and would not be able to figure out how to get past a certain level, he would walk up to my wife's iPad and basically say, “Hey, Siri, how do I do this?”
Just saying that, I probably set off a bunch of phones to listen to what I am saying, but the point is that we have children who, already at such a young age, are using this technology. I did not grow up being able to say, “Hey, Siri, how do I do this or that?”
What we have to be really concerned about is the development of children and the development of minors, what they are doing and how that can impact them and their privacy. I am very relieved to see there is a big component of this that, in my opinion, aims to ensure the privacy of minors is maintained, even though I have heard the concern or the criticism from some members today that the definition of “minor” needs to be better reflected in the legislation.
I feel as though, if it is not clear what a minor is for the purposes of this legislation, that is something that can be worked out in committee. It is something the governing members would more than welcome, in terms of listening to the discussion around it and around why further clarifying the definition is or is not important.
I would like to just back up a second and talk more specifically about the three parts of this bill and what they would do. The summary reads as follows:
Part 1 enacts the Consumer Privacy Protection Act to govern the protection of personal information of individuals while taking into account the need of organizations to collect, use or disclose personal information in the course of commercial activities.
A consequence of this first part would be to repeal other, older pieces of legislation. I think this is absolutely critical, because this goes back to what I have been talking about in terms of how things have changed over the last 20 years. We are now at a place where we really do not know what information we are giving or how it is being used. I realize, as some other colleagues have indicated, that 99.9% of the time we click "yes, I accept the terms" without reading the terms and conditions, not knowing exactly how our information is being used and what is actually being linked directly back to us.
Through the consumer privacy protection act, there would be protections in place for the personal information of individuals while, at the same time, really respecting the need to ensure companies can still innovate, because it is important to innovate. It is important to see these technologies do better.
Quite frankly, it is important for me personally, and this will be very selfish of me, that, when I am watching on Netflix a show that I really like, I get recommendations of other shows I might really like. As the member for mentioned earlier, when it comes to Spotify, it is important to me also that, when I start listening to certain music, other music gets suggested to me based on what other people who share similar interests to mine have liked, and how these algorithms end up generating that content for me.
It is important to ensure that companies, if we want them to continue to innovate on these incredible technologies we have, can have access to data. However, it is even more important that they be responsible with respect to that innovation. There has to be the proper balance between privacy and innovation, how people are innovating and how that data is being used.
We have seen examples in recent years, whether in the United States or in Canada, where data that has been collected has been used in a manner not in keeping with how that data was supposed to be used. There has to be a comprehensive act in place that properly identifies how that data is going to be used, because, quite frankly, the last time this legislation was updated, 20 years ago, we had no idea how that data would be used today.
By encouraging responsible innovation and ensuring we have the proper terminology in the legislation, companies would know exactly what they should and should not be doing, how they should be engaging with that data, what they need to do with that data at various times, how to keep it secure and safe and, most importantly, how to maintain the privacy of individuals. It is to the benefit not just of individuals in 2022, or 2023 almost, to have data that is being properly secured. It is also very important and to the benefit of the businesses, so that they know what the rules are and what the playing field is like when it comes to accessing that data.
The second part of this bill, as has been mentioned:
...enacts the Personal Information and Data Protection Tribunal Act, which establishes an administrative tribunal to hear appeals of certain decisions made by the Privacy Commissioner under the Consumer Privacy Protection Act and to impose penalties for the contravention of certain provisions of that Act.
This is absolutely critical, because there has to be somewhere people can go to ensure that, if they have a concern from a consumer perspective over the way their data is used and they are not happy with the result from the commissioner, they have an avenue to appeal those decisions. If we put too much power in the hands of a few individuals, in this case the Privacy Commissioner under the consumer privacy protection act, without any appeal mechanism, then we will certainly run into problems down the road. This legislation would help ensure that the commissioner is kept in check, and it would also help consumers have the faith they need in terms of accountability when it comes to their data and whether it is being used and maintained in a safe way.
The third part of the bill is the more controversial one, in terms of whether it should be part of this particular legislation or be subject to a separate vote. The summary reads:
Part 3 enacts the Artificial Intelligence and Data Act to regulate international and interprovincial trade and commerce in artificial intelligence systems by requiring that certain persons adopt measures to mitigate the risks of harm and biased output related to high-impact artificial intelligence systems.
That act would provide for public reporting and would authorize the minister to order the production of records related to artificial intelligence systems. It would also establish prohibitions related to the possession or use of illegally obtained personal information for the purpose of designing, developing, using or making available for use an artificial intelligence system in an intentional or reckless way that causes material harm to individuals.
One of the consequences of artificial intelligence, quite frankly, is that if we allow biased information to be fed into artificial intelligence systems and used to produce results for important algorithms, then we run the risk of those results being biased as well. Therefore, making sure there are proper measures in place so that individuals are not treated in a biased manner is going to require true accountability.
The reality is that artificial intelligence, even in its current form, is very hard to predict. It is very hard to know exactly when a person is being affected by something generated by an artificial intelligence system. Quite often, many of the interactions we already have on a day-to-day basis rely on artificial intelligence features that use various inputs to determine what we should be doing or how we should be engaging with something.
The reality is that if this is done in a biased manner or in a manner that is intentionally reckless, people might not be aware of it until well past that point, so it is important to ensure that we have all of the proper measures in place to protect individuals against those who would try to use artificial intelligence in a manner that would intentionally harm them.
As I come to the conclusion of my remarks, I will go back to what I talked about in the beginning, that artificial intelligence, quite frankly, has a lot of benefits to it. It is going to transform just about everything in our lives: how we interact with individuals, how we interact with technologies, how we are cared for, how we get around, and how we make decisions, as we already know, on what to listen to or what to watch.
It is incredibly important that, as this technology develops and artificial intelligence becomes more and more common, we ensure we are in the driver's seat in terms of understanding what is going into it and that we are fully aware of anybody who might be breaking the rules as they relate to the use of artificial intelligence. It will become more difficult, quite frankly, as artificial intelligence systems take on new responsibilities and generate new decisions and outputs, and we must ensure that we always remain in the driver's seat with the proper oversight in place.
I recognize that some concerns have been brought forward today by different members. At first glance, when the member for and others brought forward the concern around the definition of a “minor”, which is not something I thought of when I originally looked at this bill, I can appreciate, especially after hearing his response to my question, why it is necessary to put a proper definition in there. I hope the bill gets to committee and the committee can study some of those important questions so we can keep moving this along.
I certainly do not feel as though we should just be abandoning this bill altogether because we might have concerns about one thing or another. The reality, and what we know for certain, is that things have changed quite a bit in the last 20 years since the legislation was last updated. We need to start working on this now. We need to get it to committee, and the proper studies need to occur at this point so we can properly ensure that individuals' privacy and protection are taken care of as they relate to the three particular parts I talked about today.
:
Madam Speaker, for the average citizen in the digital age, we have entered uncertain times. To almost everyone, at face value, the convenience of our time is remarkable. Access to any piece of information is available at our fingertips. Any item imaginable can seamlessly be ordered and delivered to our doors. Many government services can be processed online instead of in person. Canadians have taken these conveniences for granted for many years now.
The pandemic accelerated our ascent, or descent, depending on whom one asks, into the digital age. The inability to leave our homes and the necessity to maintain some rhythm of everyday life played a significant part in that, but around the world, we saw governments taking advantage of the plight of their citizens. Public health was used as a catalyst for implementing methods of tracking and control, and social media platforms, which have been putting a friendly face on exploiting our likes, dislikes and movements for years, continue to develop and implement that technology with little input or say from their millions of users.
Canadians no longer can be sure that their personal information will not be outed, or doxed, to the public if doing so would achieve some certain political objective. We saw that unfold earlier this year with the users of the GiveSendGo platform.
The long-term ramifications of our relationship with the digital economy are something Canadians are beginning to understand. They are now alert to the fact that organizations, companies and government departments operating in Canada today do not face notable consequences for breaking our privacy laws. As lawmakers, it is our responsibility to ensure that Canadians' privacy is protected and that this protection continues to evolve as threats to our information and anonymity as consumers unrelentingly expand both within and beyond our borders.
That brings me to the bill we are discussing today, Bill . It is another attempt to introduce a digital charter after the previous iteration of the bill, Bill , died on the Order Paper in the last Parliament. My colleagues and I believe that striking the right balance is at the core of the debate on this bill. On the one hand, it seeks to update privacy laws and regulations that date from the year 2000, were implemented in 2005 and have not been modernized since. It would be hard to describe the scale of expansion in the digital world over that 22-year period in a mere 20-minute speech. It is therefore appropriate that a bill in any form, particularly one as long-awaited as Bill C-27, be considered by Parliament to fill the privacy gaps we see in Canada's modern-day digital economy.
Parliament must also balance the need for modernization of privacy protection with the imperative that our small and medium-sized businesses remain competitive. Many of these businesses sustain themselves through the hard work of two or three employees, or perhaps even just a sole proprietor. We must be sensitive to their concerns, as Canada improves its image as a friendly destination for technology, data and innovation. This is especially true as our economic growth continues to recover from the damaging impact of pandemic lockdowns, crippling taxes that continue to rise and ever-increasing red tape.
That extra layer of red tape may very well be the catalyst for many small businesses to close their operations. No one in the House would like to see a further consolidation of Canadians’ purchasing power in big players such as Amazon and Walmart, which have the infrastructure already in place for these new privacy requirements.
In a digital age, Canadians expect businesses to operate online and invest a certain amount of trust in the receiving end of a transaction to protect their personal information. They expect that it will be used only in ways that are necessary for a transaction to be completed, and nothing more.
In exchange for convenience and expediency, consumers have been willing to compromise their anonymity to a degree, but they expect their government and businesses to match this free flow of information with appropriate safeguards. This is why Bill , and every other bill similar to it, must be carefully scrutinized.
As many of my colleagues have already indicated, this is a large and complex bill, and we believe that its individual components are too important for them to be considered as one part of an omnibus bill.
There are three—