:
Welcome, colleagues, to the 61st meeting of the Standing Committee on Access to Information, Privacy and Ethics, for the continued consideration of the study of the Personal Information Protection and Electronic Documents Act, PIPEDA.
I apologize to our witnesses for the committee's tardiness today. We have good reason. We are all just getting here from the House, where House business took us a little longer than anticipated after question period.
We'll get right to it.
We have an hour and forty minutes remaining, so we should be able to get through this with no problem.
I am pleased to be joined today by Mr. Robert Watson, president and chief executive officer of the Information Technology Association of Canada. We also have Mr. André Leduc, who is the vice-president of government relations and policy.
From the Consumers Council of Canada, we have Mr. Dennis Hogarth, vice-president.
From the Canadian Chamber of Commerce, we have Scott Smith, who is the director of intellectual property and innovation policy.
Each organization will be given an opportunity to have about 10 minutes for their opening remarks. We'll go in the order in which you were introduced.
From the Information Technology Association of Canada, Mr. Watson.
:
Thank you, Mr. Chair, and honourable members. It's a privilege to be here today to discuss the evolving worlds of technology, data, and privacy on behalf of the Information Technology Association of Canada, ITAC.
ITAC is the national voice of Canada's information and communications technology industry. Canada's ICT industry includes over 37,000 companies generating over 1.1 million jobs directly and indirectly. Beyond this, the ICT industry creates and supplies the goods and services that contribute to a more productive, competitive, and innovative economy and society. In this spirit, we welcome the opportunity to support your research on the evolving privacy environment in Canada.
The Internet has become the most powerful driver of economic growth in human history, outpacing the steam engine and the advent of electricity. Over the past few decades, data has emerged as a valuable commodity with the power to solve complex problems and generate immense benefits and value for organizations, individuals, and society. The Economist recently noted that the world's most valuable commodity is no longer oil; it is data. Today, ICT companies in Canada are using data to improve traffic flows, decrease accidents at intersections, detect health risks, improve agricultural yields, and improve the quality of life for all Canadians. We hope this discussion will deliver recommendations that enhance Canada's privacy regime in a way that promotes responsible use of personal data while supporting and enabling data-based innovation that will support the continued growth of Canada's ICT sector.
At the outset, I want to make it abundantly clear that a strong privacy regime, one that maintains the trust of Canadians, is firmly in the business interests of Canada's ICT industry. Maintaining customer trust is critical to businesses, and it has vital importance when a customer trusts a company with their personal information. In an era in which data is the world's most precious commodity, this is true today more than ever. Data, including customers' personal information, is also quickly becoming essential to most business activities, be it for fulfilling customer orders, billing, customer relationships, or supply chain management and marketing. Therefore, PIPEDA is not only consumer legislation; it is also economic legislation. I encourage this committee to factor the significant economic stakes involved into its deliberations as it considers recommending any legislative changes.
Several parties have stressed that PIPEDA is being challenged by emerging technologies and new business models. However, PIPEDA's technology-neutral and principles-based approach was designed to enable it to adapt with the times. It already includes a workable framework for managing many of the challenges associated with emerging technology like data analytics. Provided that PIPEDA is not interpreted in an overly restrictive manner, it can remain an appropriate principles-based framework able to address Canadians' privacy concerns.
Over the past year, ITAC has engaged in consultations conducted by the Office of the Privacy Commissioner, and there are three areas in these consultations on which I would like to provide additional remarks. First is protecting online reputation. Second is modernizing approaches to consent. And third is the question of whether additional enforcement powers should be provided to the Privacy Commissioner.
With regard to online reputation or what is also known as the right to be forgotten, the challenge is the permanence and searchability of any online post and the impacts that regrettable choices or malicious postings can have on a Canadian's offline reputation.
To address these challenges, the OPC has raised the idea of new legislative powers or processes to remove an individual's information from the Internet. ITAC questions whether new rules are necessary at this time. Rather, ITAC would recommend that the government focus its efforts on educating Canadians, especially young Canadians, about how to interact responsibly online and to think before they post.
We also recommend that the government leverage the existing legal framework to improve its own processes for seeking redress from online libel through the court and make these legal avenues more accessible to the ordinary citizen. ITAC recommends against introducing an EU-style right to be forgotten that forces search engine companies to alter search results based on individual complaints.
Internet businesses have shown themselves willing to remove content in compliance with court orders and legal requirements, but no business should be deputized by the government to have to decide how to strike the balance between an individual's privacy and freedom of expression. These decisions are best left to the courts.
Number two is consent. There have been considerable discussions about how new technologies like data analytics and the Internet of things make it more challenging for individuals to provide meaningful consent. ITAC strongly supports the technology-neutral, principles-based approach of PIPEDA, but our members find that express consent is overemphasized in how PIPEDA is interpreted by the Office of the Privacy Commissioner.
In today's fast-paced Internet and mobile-enabled world, pausing the transfer of information mid-transaction to garner express consent is a practice with significant limitations for both customers and businesses, including individuals' unwillingness to read or understand what they are consenting to. By a show of hands, how many members of this committee have read every word of their iTunes privacy statement?
Increased technological complexity also means that different or multiple organizations may be storing, processing, and analyzing the same data, making these data flows hard to explain fully to individuals. There are also situations where unanticipated use of data could be of great benefit to users, but where it may be difficult, if not impossible, to obtain renewed expressions of consent.
With these challenges in mind, ITAC has proposed several changes that we believe will address the challenges of consent while allowing businesses to continue to innovate and generate economic value from data.
First, if express consent is not always a realistic option, frameworks should be put in place to expand implied consent in appropriate situations. Specifically, ITAC recommends a new exemption be introduced to allow for processing of personal information based upon legitimate business interests or purposes that are consistent with those for which consent was originally obtained. PIPEDA already has tools to provide boundaries for these forms of implied consent, such as the reasonable person test under subsection 5(3), and the OPC can provide additional guidance as required.
ITAC also proposes the exemption to consent for publicly available information be updated. The existing exemptions under PIPEDA regulations, essentially phone book details, are outdated and do not reflect the current landscape of personal information shared in public venues. Building on the time-tested model of PIPEDA itself, we recommend a new principles-based, technology-neutral exemption for publicly available information be developed that is better suited to adapt and evolve over time.
Last, ITAC also suggests that additional enforcement powers for the OPC are not required at this time. Enhanced enforcement powers were provided to the OPC as recently as 2015 through the Digital Privacy Act, and time is needed to test their effectiveness. Under the current framework, there is a tremendous amount the OPC can do to enhance and promote privacy, including through its public education function. Order-making powers could hinder the collaborative relationship that currently exists between industry and OPC and potentially make it more challenging for government and industry to collaborate and co-create solutions in this rapidly evolving field.
I want to thank you again for the opportunity to provide these remarks today, and I look forward to answering any of the questions you may have.
:
Thank you, Mr. Chairman.
I am Dennis Hogarth, the volunteer vice-president of the Consumers Council of Canada. I'd like to say that the council is pleased to contribute to this study.
The Consumers Council is a national not-for-profit organization that supports the protection and strengthening of consumer rights and the awareness of consumer responsibilities. It works with consumers, government, and business for a better marketplace. Consumers have a clear stake in privacy, the implementation of PIPEDA, and any improvements that might be made through this review.
Important issues have been raised during this study. They reflect the need for more clarity in definitions and interpretations in Canadian privacy legislation.
In terms of the emerging electronic environment, by 2020 more than 50 billion Internet devices will be used globally, all developed to collect, analyze, and share data, mainly from consumers. A massive, growing number of data points are collected, often referred to as “big data”.
Consumer data is collected both actively and covertly through search, social media, credit card transactions, and such sites as Amazon, Expedia, and many others. Information is also now collected more passively through seemingly benign devices that report on location, living habits, and personal preferences. Every Internet connection records information about a user. Although data can be disassociated from personal information to prevent a privacy risk, when data is combined into a big data environment and analyzed with sophisticated software, we now know that the identity or profile of specific individuals can be unmasked.
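The unmasking risk described here can be illustrated with a minimal, hypothetical sketch: a dataset stripped of names can still be linked to a public directory on shared quasi-identifiers such as postal code and birth year. All names, fields, and records below are invented for illustration.

```python
# Toy illustration of re-identification by linking an "anonymized"
# dataset with a public one on quasi-identifiers. All data is invented.

anonymized_health = [
    {"postal": "K1A0A6", "birth_year": 1975, "condition": "diabetes"},
    {"postal": "M5V2T6", "birth_year": 1988, "condition": "asthma"},
]

public_directory = [
    {"name": "A. Smith", "postal": "K1A0A6", "birth_year": 1975},
    {"name": "B. Jones", "postal": "M5V2T6", "birth_year": 1988},
]

def reidentify(anon_rows, public_rows):
    """Join the two datasets on the shared quasi-identifiers."""
    matches = []
    for anon in anon_rows:
        for person in public_rows:
            if (person["postal"] == anon["postal"]
                    and person["birth_year"] == anon["birth_year"]):
                matches.append({"name": person["name"],
                                "condition": anon["condition"]})
    return matches
```

With only two quasi-identifiers, every "anonymized" record in this toy example maps back to a named individual, which is the point the witness makes about combining datasets in a big data environment.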
In terms of the personal information risk, privacy laws lag the sophisticated uses of personal information. The accumulation of personal data creates a risk both for organizations holding it and for consumers whose information is stored.
A 2016 study by PricewaterhouseCoopers reported that many organizations still don't fully understand the risks of cybercrime and how to effectively respond to and manage these types of incidents. Issues range from low board-level appreciation of risks to weak controls used by third-party outsource vendors. Whereas consumers once knew what information they provided to organizations and why they provided it, they are now unlikely to know what information is stored about them, where it is stored, and how it is used.
This brings us to the issue of consent. Data analysis techniques grow ever more sophisticated and are now capable of accessing massive data stores. Personal information is collected, matched, and used in so many ways that it seems inconceivable that the current consent models will remain feasible or meaningful. Organizational privacy policies are often complex and one-sided and often lack transparency.
For meaningful consent, consumers need to understand how their data will be used. It is doubtful that consumers will even be able to read and fully understand the policies; yet they must accept them to participate in an unavoidable electronic world.
A sliding scale for consent has been discussed as a possible solution. Sensitive personal information would require explicit consent, as always, but use of less sensitive information might be subject to implicit consent. To enable such a solution, the definition of sensitive information would need expansion.
Increasingly, privacy protection may turn less on who obtains personal information and more on how it is stored and kept from detrimental use. To mitigate risk, greater controls must be established around organizations that make sophisticated uses of personal information. These organizations need particular oversight to ensure that they use information appropriately.
On the issue of children and privacy, the council agrees that the collection of information from children under the age of 16 should be prohibited, unless authorized by a legal guardian. However, age is not authenticated easily, and children can fool systems. Without some form of reliable registry system to verify age, controls will be hard to implement without generating new privacy concerns. Regardless, protections for children included in the general data protection regulation, GDPR, should be considered for inclusion in any revisions planned for PIPEDA.
As to the right to be forgotten, where possible and practical, PIPEDA should restrict organizations from retaining personal information that is no longer reasonably required for processing, or where it is outdated or unable to be confirmed as accurate. Reasonable limits should be placed on the retention of certain types of personal information by controller organizations or outside processors.
Big data will create greater difficulty in identifying personal data when consumers make personal information requests of organizations. Equally, it may be difficult to identify what information needs to be deleted. Technical solutions such as meta tagging of data may assist this process, but such systems could be prohibitively costly for smaller organizations to implement.
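The meta-tagging idea mentioned above can be sketched in a few lines: if each record carries a collection date and a retention period as metadata, expired personal information can be found and purged mechanically. The field names and retention periods here are invented assumptions, not a prescribed scheme.

```python
# Hypothetical sketch of retention "meta tagging": each record carries
# a collection date and a retention period, so personal data past its
# retention window can be identified and deleted. Fields are invented.
from datetime import date, timedelta

records = [
    {"id": 1, "data": "old profile", "collected": date(2015, 1, 1),
     "retain_days": 365},
    {"id": 2, "data": "recent order", "collected": date(2017, 5, 1),
     "retain_days": 365},
]

def purge_expired(rows, today):
    """Keep only records still inside their retention window."""
    return [r for r in rows
            if r["collected"] + timedelta(days=r["retain_days"]) >= today]

kept = purge_expired(records, today=date(2017, 5, 15))
```

Even a simple scheme like this requires tagging every record at collection time, which is the implementation cost the witness suggests could be prohibitive for smaller organizations.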
On the issue of enforcement, organizational focus on privacy has drifted. Therefore, PIPEDA compliance by organizations remains problematic, largely because non-compliance carries minimal risk. The Office of the Privacy Commissioner must have strong, effective enforcement measures and penalties, including punitive fines and other measures for compliance failures.
We believe that a more appropriate model would include an OPC function to review published organizational privacy policies and practices, especially where these organizations are known to make extensive use of personal information. These organizations should be required to register with the OPC, providing a description of how they collect, use, and control personal information.
Periodic compliance reviews should be made against published policies and controls over data. Review results could be posted online so that consumers can know how their information is used. Oversight could be enhanced through a regulatory model that uses independent third parties.
With regard to compliance with EU standards, the GDPR represents the current gold standard for the world and will likely form the basis for future revisions to many national privacy laws and practices. Aligning PIPEDA with GDPR might involve more effort by Canadian organizations, but compliance would provide greater protection for consumers while making Canada more competitive than non-compliant countries such as the United States. In a rapidly evolving electronic world, Canadian companies will benefit over the long run. We therefore recommend that the committee carefully consider steps to ensure that Canadian privacy legislation continues to be accepted by the EU as adequate.
Finally, on consumer privacy rights, consumer privacy rights in Canada are applied inconsistently. The OPC's website refers to the various federal, provincial, and other bodies involved. Legal gaps and overlaps exist that create confusion and will grow as a concern for consumers, who want consistent rules for organizations using their information.
In February 2012, the U.S. White House issued a report that included a consumer privacy bill of rights governing consumer data privacy. While not legally binding on organizations, the report provided appropriate guidance about privacy expectations. The council believes that the clear statement of privacy rights and responsibilities set out in the White House report should be considered for implementation in Canada.
I thank you for the opportunity to make this presentation on behalf of the Consumers Council.
:
Thank you very much, Mr. Chair and members of the committee, for allowing me to come to address you today.
As was said, I represent the Canadian Chamber of Commerce. We are a not-for-profit trade association and are the vital connection between business and government. We have a network of over 450 chambers of commerce across the country. You are probably familiar with one from your own communities. They're all members of the Canadian Chamber of Commerce, which is the umbrella organization. By extension, we represent close to 200,000 businesses across the country, of all sizes and in every single community.
My role at the chamber is intellectual property and innovation policy from the innovation perspective. That's what you're going to hear about from me today with my remarks. You're also going to hear some similar themes to what I think you heard from the other witnesses, so I hope I don't bore you.
We hear a lot about the pervasiveness of big data and about how both governments and companies are collecting information on us. Much of what we hear comes across as negative and invasive. That's unfortunate. Personal data is at the core of creating an innovative product line and user experience.
In a 2016 Accenture survey of more than 500 businesses globally, more than three-quarters of the survey respondents said big data provides better and more personalized customer service, and over half of those respondents said it enhances customer loyalty. Others indicated that the information helps them break into new markets, improve target advertising, and build better products. In a nutshell, data enables innovation.
With your indulgence, I'd like to highlight a few examples of why data is so important to innovation and competitiveness.
First, it's about understanding customers. Big data is used to better understand customers, their behaviours, and their preferences. To maintain a competitive edge, companies are moving beyond traditional datasets and using social media and browser logs as well as text analytics and sensor data to get a more complete picture of their customers.
The big objective in many cases is to create predictive models, tailored not to the individual but to the collective. The information they're collecting is about individuals, yes, but they don't really care about the individual information. It's about the collective; it's about the large volume of information they're collecting to identify patterns of behaviour.
A good example of this might be the use of data by ski resorts. Radio frequency identification, RFID, tags are inserted into lift tickets. They can cut back on fraud and wait times at the lifts as well as help ski resorts understand traffic patterns, which lifts and runs are most popular, at which times of day, and even help track the movements of an individual skier, if he or she were to become lost. All of this benefits the customer by making the experience more seamless. I know I'd be happy if I got a text telling me there was two feet of fresh powder on my favourite run, even though my employer might not be so pleased that I disappeared for the day.
The second theme is optimizing business processes.
Big data is also increasingly used to optimize business processes. Retailers are able to optimize their stock based on predictions generated from social media data, from web search trends, and from weather forecasts. Employers are able to optimize work flow by monitoring patterns of behaviour and adjusting processes wherever those behaviour patterns demonstrate high productivity.
Next is personal quantification.
We can now benefit from the data generated from wearables. How many of you have a Fitbit? I see one hand, just for the record.
It collects data on our calorie consumption, activity levels, and sleep patterns. While it gives individuals rich insight, the real value is in analyzing the collective data. Analyzing the decades' worth of sleep data that is collected in a single night will bring entirely new insights that can feed back to individual users.
The same is true in life sciences. Clinical trials of the future won't be limited by sample sizes but can potentially include everyone.
While big data is used to enable law enforcement, it is also used by our financial institutions. Credit card companies monitor behaviour patterns. When those patterns deviate from predicted norms, customers are notified, which helps prevent fraud and identity theft.
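The pattern-deviation check described here can be illustrated with a minimal sketch: flag a transaction when it falls far outside a customer's historical spending pattern. The threshold and the sample data are invented for illustration; real fraud models are far more sophisticated.

```python
# Toy sketch of deviation-from-norm fraud flagging: a transaction is
# anomalous if it lies more than `threshold` standard deviations from
# the customer's historical mean. Threshold and data are invented.
from statistics import mean, stdev

def is_anomalous(history, amount, threshold=3.0):
    """Flag amounts far outside the customer's historical pattern."""
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return amount != mu
    return abs(amount - mu) / sigma > threshold

past_purchases = [42.0, 38.5, 45.0, 40.0, 44.5, 39.0]
```

A $2,500 charge against this history of roughly $40 purchases would be flagged and the customer notified, while a $43 purchase would pass unremarked.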
PIPEDA predates social media, it predates video streaming, and it predates the notion of ransomware, which we all heard about this past week; yet it has done a pretty good job of remaining relevant as technology has evolved.
Because it is principles-based legislation, government action to react to technological change hasn't been necessary. Judicial oversight has proven time and again to be an adequate recourse where an organization has stepped outside the boundary of reasonable use of data.
Notwithstanding, significant changes were made to PIPEDA in 2015. Legislative change to something as ubiquitous as privacy legislation always has a profound impact on business because of the uncertainty it introduces into the economy. Some of the changes introduced in 2015 are not even yet in effect. We're still waiting for the details on how companies will be expected to comply with the breach notification requirements and the indefinite record-keeping on all of those breaches. We don't really understand right now what that's going to mean. While the clarification to the definition of consent did little more than recognize a common best practice, it did cause some consternation in the business community as to what the change was attempting to accomplish at the time.
Although we need to monitor what happens in other jurisdictions to ensure our laws are compatible with our trading partners, to ensure the free flow of data and the ability to innovate, doing so preemptively could have unintended consequences. For instance, changes to the general data protection regulation in Europe are imminent, and equivalency in Canada might be put to the test. However, we must understand that the GDPR is much broader than just privacy. It's as much about the public sector and security as it is about privacy.
For instance, a comment was made about the U.S. and the U.S. surveillance. That is a factor when we're dealing with the GDPR. It's a lot more than just our privacy legislation.
Tightening controls on the collection, use, and disclosure of personal information will not likely have a positive impact on privacy protection. The manner in which information is collected and the business model that information collection is built on make tighter controls untenable, and we're talking about basic behaviour. Trying to create a consent model around behaviour is next to impossible.
Sharing personal information requires trust. Maintaining that trust requires digital responsibility best practices, and to name a few of those: ensure personal data management meets consumer expectations; show transparency in how personal information is sourced; give people more control over their data; explain the benefits consumers earn from sharing information; and use data for social improvement.
The companies that embrace these best practices will be the ones to prosper as new technology such as blockchain evolves that will put control of personal information back in the hands of the individual.
While this past weekend's WannaCry ransomware attack may not have been focused on personal information, it is certainly a global wake-up call regarding the vulnerability of the digital economy. That means we also need a more robust response to cybersecurity concerns.
I'll give you a couple of recent statistics. In the third quarter of 2016 alone, 18 million new malware samples were captured. More than 4,000 ransomware attacks have occurred every day since the beginning of 2016. The share of phishing emails containing a form of ransomware grew to 97.25% during the third quarter of 2016, up from 92% in the first quarter of 2016. Although 78% of people claim to be aware of the risks of unknown links in emails, they click anyway.
The data that's collected, stored, and used by organizations is extremely valuable. Some of that value is yet to be conceived, but governments and organizations alike are vulnerable to attack, and I would argue that resources would be better used in international collaboration to target the criminal enterprises attacking databases rather than in monitoring the organizations that are innovating and serving customers.
With that I will conclude my remarks. Thank you for your attention.
:
There's little question. A good example coming out of the EU is cookies. Every website you visit in Europe has a warning that pops up first.
I'm not sure anybody is more or less protected by this policy. It's burdensome for companies, and it's burdensome for consumers, who, I would venture to guess, 99.99% of the time will click through and allow cookies so that they can get the information they're looking for.
Is this type of regulation really doing anything, then? We talked about whether anybody has ever spent the time to read through the privacy policies that you see posted on a website, or do you just click through very quickly so that you can get to what you need to get done? Consumers in this day and age are always just clicking through.
There's also a system of checks and balances built into privacy legislation. It is not in the best interest of a private sector company to abuse the personal information of their own customers or clients. You can talk to T.J. Maxx, you can talk to Home Depot, you can talk to Target about the implications of having a significant data breach. Those companies were the victims of a data breach, of hackers getting into their system and accessing the personal information of their customers. They're being victimized, and they're doubly victimized by losing a number of consumers.... For the larger businesses, that's great; they'll survive. For a Canadian SME.... You'll lose half your customers. That's usually an end-of-life incident.
It is, then, in the best interests of the businesses when they're collecting the information.... You can see how valuable it is now. As we point out, it is the new oil. There's a very high level of value for it, and protecting and storing that information and being able to analyze it is in the best interest of these private sector entities.
:
There were updates in the Digital Privacy Act in order to focus on the protection of minors—not the guys with hats who live in caves, but the children who we have to deal with—and it has to be a balanced approach.
Robert pointed out that we need better education. This is the advent of the Internet. It's a really big thing. Whether it's the school systems, the parents, or the community groups, we need to be educating kids about the potential dangers.
When you're dealing with something like Fitbit, where it's tracking your heart rate and everything, there isn't a lot of danger there. What we're talking about on the big data side—it's really exciting—is that maybe they'll be able to notify you by a text message half an hour before you have your heart attack. That's where we're heading. That's where big data analytics is going.
In terms of protecting minors, it's very difficult to put the onus on the company that is collecting that information, other than asking you if you are under the age of 18, under the age of 19, or under the age of 21, and saying that if you are, you have to get the consent of your parents in order to fill in that information.
Beyond that, there isn't a lot there. How many 14-year-olds would go to their parents to get the okay to fill in the information on the Fitbit? How many parents would go, “Would you just leave me alone?”
Again, I know that I keep reiterating the same point, but when you look at the reasonable use, the reasonable connection, and a reasonable person test for evaluating what is okay and what isn't, you see that it's a lot easier than trying to regulate a consent regime that maybe doesn't really have any value to it. You're not really getting informed or educated consent, and you can't really tell the age of the person you're collecting from, because I would venture to say that most 14-year-olds would ignore that fact and say, “Oh, it won't let me if I'm 14, so I'll just click on 18, and then I'll get through.”
:
Thank you very much, Mr. Long. I appreciate that.
I'll take the round for the Conservatives for the next five minutes, if that's okay with my colleagues.
As a former IT professional, I understand completely what you're saying when you say that data is the most valuable corporate asset. That's been the way of the information age for quite some time, and now, as you've said, data is becoming more valuable than oil, which is interesting.
Mr. Smith, I'm going to you, because I'm going to follow up on what Mr. Long's question was. Data is becoming very, very useful. Actually, it's information that is more useful. Data is raw facts, whereas information is data that has been organized into something of value and use.
Here's my question for you, Mr. Smith. You have been very clear that it's the data, the de-identified data that predicts trends and so on, that a particular user or group of users in a certain age group—or a certain whatever—might be interested in, so that we can have predictive modelling for the purposes of sales and business. I don't think most people have a problem with that.
I actually like the fact that my iPad from time to time knows what I'm thinking more than I do. That's okay, but what about a Fitbit? What if a Fitbit's information about sleep patterns, a resting heart rate, and any other health data gleaned from it were to get into the hands of a prospective employer prior to an interview? What if it wasn't de-identified, we actually knew who that individual was, and it became an issue, much like the genetic discrimination bill that we just passed in Parliament? What if it became an issue that was keeping somebody from getting a prospective job? Perhaps that Fitbit is measuring their weight and other habits that might predispose somebody to prejudice when that person is applying for a job.
I would be interested to see what the point of view might be from Mr. Hogarth and Mr. Smith on this.
:
Fair enough, I appreciate that.
I have a question for Mr. Watson or Mr. Leduc.
When it comes to the threshold for compliance and monetary penalties, we talked about how it's different for different companies.
Mr. Leduc, or maybe it was Mr. Watson...I think you said it would be okay for Target, that they'd survive. Target is going to survive because they're a large enough company, but a small or medium-sized enterprise might not survive if their data is breached and there were monetary penalties associated with it through any changes that this committee might recommend in the legislation.
Should there be a threshold? I'm not much for arbitrary lines in the sand when it comes to legislation, but should there be a threshold, so that companies that are small and don't necessarily have a privacy person appointed...?
I mean, I had my own IT company before I did this. I was a one-man shop. I was my own privacy consultant in my company. What do we do for those smaller companies? Should we have an exemption so that those companies would be not affected in the same way as a larger corporation, or is there an inequity and unfairness inherent in that?