CHAPTER 1: PRIVACY MATTERS: RIGHTS,
VALUES AND ATTITUDES
I've said before that you can have a perfect society and perfect order and perfect control if that is what you want, but what you give up is any vestige of your rights as a free, autonomous, unique human being. We really have to take a hard look at how far we're going to go ...1

Bruce Phillips, Privacy Commissioner of Canada
Privacy as a Human Right and Social Value
If we were to isolate two concepts that Canadians presented to us as fundamental in our discussions as we travelled across the country, they would be "dignity" and "autonomy". The well-known privacy advocate, Simon Davies, pointed out to us that privacy is central to both these qualities.2 Bruce Phillips, the Privacy Commissioner of Canada, put it another way. "The thing that animates decent societies," he said:
is observance of the principle of fairness: that we treat each other with a reasonable degree of respect and are not going around behind each other's backs with little pieces of information that we can use against each other. That is not the kind of open, transparent, candid society we want to build.3

The Canadians we spoke with were clearly committed to building this kind of candid and open society, and argued that privacy, as a core human right, remains essential to the workings of a healthy, meaningful democracy. In the words of Darrell Evans:
I think privacy has to be seen as a basic human right. To me, privacy is an essential part of human freedom. Reading through the case studies, the picture I got was that what freedom is there in a society where those kinds of scenarios can play out?4

Many considered privacy the most fundamental of human rights because its existence encourages us to make use of other rights. Committee member John Godfrey, reporting on the discussions in Montreal, summarized them in this way:
Privacy is not a free-standing right but it is often associated with other more established rights, as a sort of associated or pre-conditional right. The right to free assembly can be chilled or damaged by excessive knowledge about you, say through video surveillance. If you know that there are going to be cameras picking you out as an individual, depriving you of your anonymity, that might reduce your inclination to assemble, or indeed, your inclination toward free speech.5

Certainly, many speakers at the townhall meetings agreed that any debate about privacy highlights the clash between individual protections and societal protections.6 But at a more fundamental level, Canadians see privacy not just as an individual right, but as part of our social or collective value system.7 As we struggled with the impact of new technologies on our understanding of privacy, we realized that, ultimately, we were talking about what kind of society we want for our future.8 Canadians view privacy as far more than the right to be left alone, or to control who knows what about us. It is an essential part of the consensus that enables us not only to define what we do in our own space, but also to determine how we interact with others - either with trust, openness and a sense of freedom, or with distrust, fear and a sense of insecurity. In the words of Committee Vice-Chair Andy Scott, the participants felt that:
Ultimately, this isn't a technical question. Ultimately, this is a question of fundamental values. . . I believe that our obligation as legislators is to somehow reach into the collective wisdom of the country and citizenry and find out what it is that people believe their laws should reflect.9

The concept of privacy in today's high tech world has taken on more dimensions than ever before. To some, it is the right to enjoy private space; to others, it is the right to conduct private communications, to be free from surveillance or to respect the sanctity of one's body. However it is defined, privacy, in the words of Committee Chair Sheila Finestone:
. . . is a core human value that goes to the very heart of preserving human dignity and autonomy. It is a precious resource because once lost, whether intentionally or inadvertently, it can never be recaptured.10

As we conducted our townhalls, we found - unsurprisingly - that privacy is reflected through many lenses. What emerged was a rainbow of values, interests, knowledge and experiences. Nonetheless, we could not help but be amazed by the degree of consensus that emerged in each of our meetings. Union members shared a common concern with their managers; workers in the private sector could make common cause with their public sector colleagues; genetic researchers agreed with advocates - they all believe that privacy matters.
Privacy: Paradise Lost?
Many of Canadians' fundamental values - including privacy - are undergoing challenges from the profound socio-economic changes that result from the use of new technologies. In many ways, what separates the privacy debate of today from that of 15 years ago is what several participants in our townhall meetings called our obsession with risk reduction and certainty. The benefits of new technologies are often defined by the economic efficiencies that they introduce. Clearly, there are also societal benefits in reducing street crime, fraud and illness. But too often the debate ends there.

In our quest to reduce risk and make society more predictable, we have, as David Lyon argued, "ignored human rights in the most profound sense".11 Just as introducing video surveillance in shopping malls fails to reduce crime but merely moves it to other places,12 our obsession with risk management leads us to create categories of people which may or may not accurately describe who they, in fact, are.13 For example, our desire to control public funds may lead us to categorize all recipients of social benefits as potential perpetrators of fraud. The possibilities for discrimination based on these categories will have profound implications for the type of society we are building for the future.14 As Committee member Jean Augustine concluded, "We talked about this being the slippery slope, and the need for guidelines and protocols"15 to ensure that the most vulnerable members of our society are not the first victims of the loss of privacy.
Many participants in our meetings also expressed concern about a widespread sense of defeatism and technological determinism, where our collective destiny is perceived to be determined by the kinds of technology we are capable of.16 As Committee member Sharon Hayes reported, participants felt that we will be unable to find the appropriate balance if we "continue to allow technology to be the tail that wags the dog."17 We should, they argued, take control of the process, and determine not only what we can do with these new technologies but what we should do.18
In many ways, the "soul of the issue", as Kate White reported, "seems to be one of trust". Who do we trust to know things about us and take our privacy concerns into account?19 General discomfort, both from a consumer's and an employee's point of view, greeted the idea of leaving these issues to the private sector.20 On the other hand, many placed their trust in the government to advocate for the best interests of society.21 But this trust was far from blind. As Marnie McCall reported, the Consumers' Association first made a recommendation regarding privacy protection to the federal government in 1973.22 Ken Rubin, an Ottawa privacy advocate, first made a submission on privacy to a parliamentary committee in 1982.23 And Evert Hoogers told us that his union has been asking for the prohibition of employee monitoring in the workplace for the past 15 years.24
As much as we found a sense of cautious optimism that it was not too late to protect our privacy, we encountered a clear sense of urgency.25 People across the country called on the government to act now, or to risk losing the trust citizens have traditionally placed in our legislators to balance our social good with economic and political goals.
A. Privacy, Power and Community
This sense of balance formed a recurring theme in our discussions. Canadians do not see privacy in isolation or as merely an individual right but as part of the fabric which holds our society together. David Lyon summed it up as a belief:
that we live in a participatory democracy where mutual trust is assured because we deal with each other as people who have disclosed things to each other within those relationships of trust... and that's why it's quite different from a residual question of privacy. It's a social question.26

Accordingly, many of our townhall participants tied issues of privacy to questions of power and community.
People feared that attitudes toward privacy issues reflect a legacy of fatalism that George Orwell expressed in his discussion of Big Brother in the novel 1984. We often feel powerless when confronted by new privacy problems and feel that the situation is beyond any control that we might exercise as individuals.27 In Simon Davies' assessment:
The public perception is - well, they know everything anyway; there's no hope; anything I do can ultimately be traced. It's almost as if there's this resignation... that there's nothing you can do. So people tend to opt out completely and just say they'll accept that privacy rights have been eliminated.28

Opinion polls tell us that this sense of powerlessness is strongest among people who are poorly educated and those who believe that their personal information has been used in a way that invaded their privacy.29 Our townhall participants felt strongly that the communities least able to resist invasions of privacy, such as people requiring social assistance30 or those who are functionally illiterate,31 are hit first by the adoption of new invasive technologies.
We heard many stories illustrating the potential repercussions of this vulnerability. For example, the sequence of digits on the Social Insurance Card indicates where the card was issued and whether the card holder was an immigrant to Canada. This information, in turn, leads to potential for discrimination by the government and by the private sector.32 In Fredericton, we heard of two pregnant women who faced the possibility of delivering children with disabilities. When they refused to undergo genetic fetal testing, it was strongly recommended that they submit to psychiatric evaluation.33 In Calgary, we discussed the frightening prospect of eugenics and the removal of classes of people from society through selective abortion.34 And in Fredericton, we faced the spectre that discrimination against persons with disabilities that is based on economically-driven, private sector decisions will only grow with greater access to genetic information.35 To ensure an end to this type of discrimination, participants called on governments to act immediately to provide vulnerable communities with special protections.
We also need to eliminate the possibility that our sense of responsible citizenship and our `community-mindedness' might be undermined by the false impression that technology is taking care of things. For example, witnesses to an accident could come to rely on a video camera recording the relevant details and feel that they had no obligation to report what they had seen. Instead, they would rely on the anonymous person who views the video recording to do the job that a citizen ought to have done.36
The tools we use to protect privacy must be developed within a social context that protects our sense of community. Once again, the privacy prism requires us to evaluate our underlying goals as a society and to take responsibility for the consequences of new technologies.
B. Privacy as a Commodity
As Randy Dickinson pointed out, the use of technology not only affects individuals; it also has an impact on the commercial activity of the community as a whole.37 Many townhall participants feared that privacy has become a commodity that people are prepared to trade off for either a better level of service or product or the minimization of penalties.38 Paul-André Comeau, the Privacy Commissioner of Quebec, warned against a debate about privacy that focused solely on the commercial value of information. This was, he said, "the slippery slope we are lured onto by the new technologies in their attempt at putting a dollar figure to each piece of information."39

In large part, this issue grows out of what we earlier called an obsession with risk management by those who administer programmes that involve entitlements or benefits. But there is also a growing commercial imperative which makes these questions increasingly problematic.40 Mr. Comeau also told us that:
It is dangerous and, at any rate, it could be very harmful for Canadians to see a debate focusing solely on the commercial value of information pertaining to privacy. Of course this information does have a commercial value, but it is first and foremost a question of basic rights.41

Many people at our townhalls feared that those who want to violate privacy for their own economic gain exercise too much influence over the nature of privacy legislation.42 They argued that we will not find the appropriate balance between privacy rights and efficiency if the process of regulating privacy continues to be driven solely by economic and administrative interests. They were concerned that, left to its own devices, private industry will make choices that affect privacy based on self interest rather than the public good.43 Indeed, the perceived threat to privacy seemed greatest from the private sector,44 particularly as the government hands over many of its traditional activities.45
This commercialization of privacy was tied to the question of ownership. As Jean Augustine reported:
Over and over again, I got the message that people were looking for... some strong indication, guideline, policy direction, some way in which we can control who owns information... and [assert] the individual's right to the ownership of the information.46

Participants argued that Canada lacks clear principles and guidelines about who owns information and who can use that information for economic or commercial gain. If individuals own information about themselves, then the ability to consent to sharing that information is an essential part of ensuring that the individual retains control over his or her privacy.
Meaningful Consent
Generally, people saw consent as a primary tool to protect privacy from technological invasion. But participants distinguished between `token consent' and `meaningful consent'. They feared that informed consent becomes an empty concept when people do not know how information about them is being collected,47 or are forced into giving consent in order to get something.48

In many cases, these fears are justified. In order to get or to keep a job, employees will accept serious invasions of their personal privacy and feel powerless to object. For example, if someone cannot get a job without undergoing a particular genetic or drug test, that person does not have a free choice. The same principle governs those who apply for insurance coverage. As Margaret Somerville pointed out, there is a difference between "mandatory" testing and "compulsory" testing. Compulsory testing creates a firm requirement to take the test, as for example, when it is a condition of continued employment. Mandatory testing has the appearance of being voluntary because one's consent to the testing is required. However, as refusal of consent will lead to denial of services or benefits, the test becomes, in reality, quasi-compulsory.49
Over and over, we heard that this lack of meaningful consent was an issue of prime concern to the participants in our consultations. Again, the issue of balance was raised. Instinctively or knowledgeably, Canadians organize information about themselves in a hierarchy. To illustrate: People consider privacy to be a right but also recognize that in order to participate fully in society, as citizens or as consumers, they must allow others access to and the use of certain types of personal information. They know that information must be exchanged and that emerging technologies can facilitate personal and social interaction that benefits everyone. At the same time, most people want to see technology used under controlled conditions with considerable sensitivity to the human rights aspects of its use.50 In other words, we are seeing the emergence of a demand for `informational self-determination'.51
Each and every one of us is accustomed to requests for, and to providing, certain information about ourselves. Under normal circumstances, we are not particularly sensitive about giving our names or our ages. But we are increasingly cautious about providing home telephone numbers, buying habits and especially financial or health information.52
Many people approve of advanced technologies, especially when they are applied to obtain such community benefits as crime control. In some circumstances, individuals appear to implicitly enter into a voluntary contract by consenting to abridgement of certain privacy rights in return for certain benefits. The problem arises when this contract is extended by those who collect and control information to other things which most people consider to be absolutely private.53 In Margaret Somerville's view, "We've got to get over the technological imperative... `have technology, must use'." The issue in the minds of most of those who spoke to us was "How do we decide which technology to use when?"54
Experts have expressed concern that privacy interests are at worst, ignored, and at best, not given sufficient weight in determining the balance between privacy and security or privacy and economic interests. As Marc Rotenberg pointed out, "because there are one or two instances where the technology has aided in public safety, there's little basis [of support] in restraining or slowing the deployment of the technology."55 But such instances do not vitiate the need to safeguard privacy and individual ownership of personal information. We must control the use of personal information through the concept of real and meaningful consent, freely given by an individual who has the power to say no without suffering any adverse consequences.56
PRIMARY AND SECONDARY USES OF PERSONAL INFORMATION
The primary use of technology refers to the purpose for which the technology was developed and/or installed. For example, the primary use of video cameras installed on a main street is to protect the public from crime. Secondary use is a term used to describe what happens when the information collected by the technology is used for purposes other than those envisioned by the developers. The example used in the case studies involved a recording made by a main street video camera of a man attempting suicide in his car. The video enabled the police to call 911 and save the man's life, which was in keeping with the primary use of public safety. However, the video was then sold to the media. The sale did not promote public safety. Rather, it was a secondary use of the recording, clearly outside the primary purpose for which the tape was made.

Another theme which was raised throughout the consultation process was the need to restrict or control the uses made of personal information. Participants felt that it was crucial to look at the purpose for which information is collected when determining which uses are or are not appropriate. Many feared that much information being gathered is, indeed, being collected without a specific purpose, and that this should not be tolerated. In addition, they argued that it is essential to test our assumptions about the usefulness of the technologies we use to collect information. For example, replacing guards in Ontario jails with video cameras may have cut expenses, by reducing the cost of prisoner surveillance, but it caused a number of other problems in the prison community which far outweighed the initial savings.57
In addition, participants wanted controls over the use of information once it is collected.58 For example, the participants at our townhalls universally considered it unacceptable to sell a videotape of an individual attempting to commit suicide in a public place to the media for public broadcast. Not only was this secondary use of the tape distasteful to the townhall participants; it contravened the implicit contract that street surveillance will only be used to promote public safety.
Similarly, a health worker questioned the uses of video cameras and employee access cards in a New Brunswick hospital. The administration claimed that the technology was in place to protect the employees. However, the cameras pointed not at the public (i.e. those entering the hospital) but at the workers, and the access cards were used to record when employees started and ended their work day, even though their collective agreement prohibited punching a time clock.59
To avoid these problems, both the primary and secondary uses of information gathered by new - and existing - technologies should somehow reflect the reasonable expectations of the individuals about whom the information is being collected. Moreover, the person or organization seeking to use invasive technologies should be required to establish the precise nature of the common good that justifies the invasion.60
Once again, people called for an appropriate balance. The challenge put by the participants in the townhalls was summarized by John Godfrey in this way:
Our first task is to balance the rights and needs and convenience and security of society against the less convenient nature of human rights, which are always awkward and always difficult, but just simply fundamental.61
The Future is Now
In order to meet this challenge, we must address the growing gap between the rapidity of technological change and the slow evolution of human rights.62 The vast majority of participants did not want to turn the clock back but rather, as Committee member Sarkis Assadourian noted, "to catch up with technology"63, so we can control and manage it in a way that protects our privacy rights. Randy Dickinson put it this way:
We're not against technology. We're just very concerned that it's used to benefit the community and protect the citizens, and not allowed to be misused and abused by people who don't share the same ethical standards as the people who are here [at the Fredericton townhall] today.64

In many ways, our case studies underlined the fact that new technologies have not necessarily created privacy conundrums. People have always used personal information to make decisions about access to goods and services, or to enforce public standards of behaviour. However, the fact that the technology is now so efficient at gathering this information brings these problems to a whole new level of privacy invasion.
With regard to the specific technologies that our case studies singled out, we heard a considerable amount of discussion about the risks that must be understood before any informed decision can be made about where the appropriate balance should lie.
A. Genetic Testing
The relationship between privacy and genetic testing caused some soul searching among those who came to our meetings. Margaret Somerville told us that:
Genetics requires us to rethink, even reimagine, our assumptions, attitudes, values and beliefs... What we are addressing are the most fundamental, wide-ranging values on which our society is based... We are also addressing - and this is what makes it unusual, because you don't often get these in such close relationship - the most individual, intimate, personal, moral issues.65

Generally, people agree that genetic technology has very real and personal benefits in terms of providing medical diagnosis and care. However, both privacy advocates and genetic researchers argued that increasing commercial interest in the information will spur on employers and insurers who can, in fact, already gain access to genetic data by obtaining personal medical files. Accordingly, the potential for misuse of this highly sensitive personal information is very real and has already become a problem.66 In addition, the caveats and cautions many people expressed to us were based on their concern that information generated by genetic testing will be misused for purposes that have nothing to do with the medical well-being of the individuals who have undergone the tests.67 Instead, uses will grow out of the thirst of the state and of the private sector for personal information.
There are legitimate reasons for genetic testing. For example, the United States military tests the DNA of its members so that, if a member is killed, the remains may be identified. But problems arise when authorities use this information for secondary purposes, such as the U.S. military's passing it on to law enforcement agencies. In essence, this permits the police to conduct a search that would otherwise require appropriate and direct legal approval.
In addition, genetic information provides information not only about an individual but also about his or her blood relatives, as Committee Vice-Chair Maurice Bernier reported:
When someone goes for genetic testing, that person is not the only one concerned by the results. In other words, not only do we gain information on that person but also on that person's total family. People who have never given any consent might be affected by decisions whose origins they don't even know. This is a serious problem that was emphasized and it should be taken into account.68

Given the formidable repercussions these types of choices will have on society as a whole, experts and non-experts alike agreed that genetic information "involves a difference in kind, not just degree".69 Universally, the Canadians who we met during this study called for special measures to ensure that genetic data is used in ways which are consistent with our underlying values.70
B. Smart Cards and Biometric Encryption
The discussion concerning smart cards once again reflected the need to find a balance between convenience and efficiency on the one hand and personal freedom on the other. Smart cards, as opposed to the more common swipe cards used by automated tellers, for instance, contain a computer chip with enough memory to store a great deal of information. The type of information depends on the function of the card. Our townhall participants recognized that smart cards have advantages; they simplify our lives and promote the efficient administration of public and private funds.71 But at the same time, people called for measures to ensure that we will be protected from inappropriate secondary uses,72 and that the technology will only be used by those who genuinely consent to it.73

Concerns about secondary uses of the information on a card were strongest when it contained health information. For example, the health card experimentally introduced by the Régie de l'assurance-maladie du Québec in the city of Rimouski contained extremely sensitive medical information including personal and family history, test results and medical diagnoses. As Paul-André Comeau said:
This type of technology obviously raises important questions: can you imagine who could have access to this information? Could, for example, indiscreet eyes see that information, with obviously very serious consequences? What if, for example, a voluntary pregnancy termination, an abortion, was included on the chip and this became known elsewhere? It doesn't take much imagination to foresee the problems this could cause.74

The coupling of biometric technology with smart cards raised further concerns about the relationship between the individual and the collective. Biometric technology is based on the collection of data relating to personal characteristics - for instance, fingerprints and handprints. The technology allows that data to be digitized and then encoded on a card or in a database. Institutions such as banks or immigration authorities can then identify an individual by scanning his or her finger or handprint and comparing it with the digitized picture on the card or in the database.
Cards containing digitized handprints are being used, for example, in the CANPASS project. The CANPASS project is a fully automated immigration and passport control system being pioneered by Canada, in conjunction with the United States. Willing individuals allow their handprint to be scanned and encoded on a CANPASS smart card. Immigration and customs officials are then able to use the card to verify the identity of the cardholder when he or she is entering the country. The project aims to replace a substantial number of passports with smart cards in the next 10 years.
As Simon Davies noted, the use of this technology "raises enormous questions of human identity that need to be addressed now".75 However, the technology can be used either to invade privacy or to protect it. It is important to remember that the information on the cards can be encrypted in such a way that the cardholder has total control over who accesses it. Through the use of encryption, we can still obtain the benefits of fraud-proof identification without necessarily invading privacy.76
C. Video Surveillance
No longer is surveillance technology solely the preserve of national security and law enforcement agencies. Users range from banks to corner stores. The technology itself is inexpensive and easy to use, and the security industry that uses it is generally unregulated.77 Accordingly, we find more and more cameras monitoring our movements - from the bank, to the office, to the corner store.

Participants felt that this constant monitoring of individuals in public and private places is inconsistent with a free society. Many participants discussed the value of keeping our movements private, not because we have things to hide, but because constant monitoring takes away from our sense of autonomy. Recent advances in surveillance technology have exacerbated the problem. Our understanding of private and public spaces is not in keeping with technologies that can listen in on conversations taking place in cars as they drive by, or peer into buildings over a mile away. People agreed that the reach of surveillance technologies should not exceed our reasonable expectations of privacy, and should be balanced against the value of personal freedom.
People were also uncomfortable with the lack of controls over the use of surveillance technology by the private sector. Our laws have evolved to protect us against invasive surveillance by the state, but the lack of restraints on other organizations was considered unacceptable. Most of the participants were willing to accept some level of surveillance to protect individuals and property from crime. However, they called for strict control over secondary uses of surveillance tapes, and the development of professional standards for the security industry.
People were also shocked to discover that criminal laws prohibiting the interception of private audio communications do not extend to surreptitious video recordings. The band-aid nature of many laws dealing with privacy reinforced the general feeling that we need some sort of comprehensive framework legislation to ensure that the benefits of new technologies do not override our privacy rights without good cause.
Knowledge is Power
The Canadians we spoke with agreed that the only way to achieve the necessary balance between individual and societal rights is through open communication and dialogue.78 To initiate that dialogue, however, we must raise the public's awareness of how technology changes our social relationships. In the words of Darrell Evans:
I think part of the confusion over the privacy debate is that it hasn't been seen as a fundamental one. It's not rooted in a kind of grassroots feeling. We need a definition, or a firmer idea in the public mind, of what privacy really is, what we mean when we say `privacy'.79

The Privacy Commissioner of Canada provided more evidence of the public's need for information when he told us of the growth in the number of public inquiries to the Commission as a result of the explosive growth of the information world and of technology.80 Again, survey research reinforces our observations. Sixty-one percent of those who responded to a 1992 survey indicated that they did not really know where to go if they wanted to deal with an invasion of their privacy. Only one in five had any knowledge of legislation, provincial or federal privacy commissions or private means of redress. Only two percent knew about human rights legislation and less than half of one percent about credit bureaus.81
As we held our townhall meetings, we also observed that the level of discussion and debate depended on the nature of the privacy protections in place. Concerns about privacy were highest, and knowledge of means of redress lowest, in provinces with the least privacy protection. In provinces where privacy legislation and protection were in place, both experts and laypeople were more readily able to define the issues and relate them to the requirements for the future. This was particularly the case in Quebec, which has the highest level of privacy protection in North America.82 In many ways, people's attitudes toward privacy reflect their perception of their ability to affect their individual circumstances.
We are convinced that governments and the private sector in Canada must raise the public's awareness of how new technologies are changing our relationships, and initiate an ongoing dialogue among Canadians about the underlying values that fall within the rubric of privacy. Our task requires us to examine these basic values candidly and build a consensus about the kind of society we want for the future. Technology will fulfill its promise only if we, as a society, participate in an informed ethical and policy debate about the importance of privacy as a human right and social value.83 In the words of Maurice Bernier:
In conclusion, I would just like to say that the point on which everyone particularly focused is the absolute need to sensitize Canadians to the emergence and impact of new technologies, and to ensure that they are continually well-informed. Sensitization and information can be considered the key to successfully introducing any new technology.84

The value of dialogue and consensus building was brought home to this Committee throughout the consultation process. Although stakeholders came to the table with very different perspectives, it was clear that there is an underlying consensus about the primary importance of privacy, and the need for strong measures to protect it from technological innovations. As Sheila Finestone concluded:
Last, but not least, there was the overall, general sense that there should be a philosophic principle that is value-based, and that the legislation that flows from it needs to be strong and not subject to technological change.85
Crossroads
We are at a crossroads in terms of defining fundamental human values and principles. If forward-looking protections, or at least a public consciousness of the issues, are not put in place soon, it is possible that we will have to "kiss privacy good-bye in the next century."86 In the words of Darrell Evans:
I think the vanishing of privacy would be a victory of materialism over the human spirit. I find it very hard to picture what kind of room there would be for creativity on the part of human beings in such a world. I feel the virtual bars closing in faster and faster in a world like that. We are constantly told it is a more secure world, of course, a more efficient world, a world that catches fraud much better, but to me, that is the victory of bureaucracy over human creativity. An old phrase comes to mind here, that we know the price of everything and the value of nothing...
What is our goal in all this? What do we seek for individuals in this? We want to put individuals in a place of causation rather than being a complete effect of technologies and of a gradual erosion of our privacy. If we are to maintain human freedom, I think that's what we have to do.87

This is the task of this report. Before we outline our vision for a comprehensive system of privacy protection, however, we must first examine the protective frameworks that are currently in place.
1. Standing Committee on Human Rights and the Status of Persons with Disabilities, Evidence, Meeting No. 24, p. 5 [Hereinafter cited as Evidence, 24:5]
2. Evidence, 22:14
3. 24:17
4. 34:16
5. Evidence, 38:21
6. 37:26
7. 38:26-27, 30-31, 52
8. 33:27-28, 40
9. 38:55
10. 33:3
11. Evidence, 33:20
12. 27:21
13. 33:27
14. 33:32
15. 37:12
16. 34:30; 37:4, 20, 38; 33:15
17. Evidence, 34:17, 20
18. 33:26
19. 33:13, 23
20. 37:23
21. 33:43
22. 33:45
23. 33:45
24. 33:42
25. 33:15, 17, 24, 28, 45; 37:22
26. Evidence, 33:40
27. Evidence, 37:14. David Townsend, for example, argued that it is unlikely that individuals will be able to negotiate their own privacy protections in a technological world.
28. Evidence, 22:22; 22:13
29. Privacy Revealed, p. 4ff. According to this survey, 60% of Canadians feel they have less privacy than they did a decade ago, and 40% feel strongly that their privacy has eroded.
30. Evidence, 33:15
31. 37:21
32. 39:15-16
33. 37:18
34. Evidence, 35:27
35. 37:34
36. 38:11
37. 37:33
38. 22:13
39. 21:22
40. Evidence, 33:27
41. 21:22
42. 36:16
43. 33:41
44. 37:16, 23
45. 33:45; 39:41
46. 38:69
47. Evidence, 33:28
48. 33:25; 34:16; 36:20; 37:16, 21
49. 28:18
50. 21:21
51. 21:4
52. The 1992 survey showed that information about age caused extreme concern for 8.5% of those who responded. Home phone number and name concerned about 24%, while 30% worried about buying habits and 44.6% were extremely concerned about financial information. Evidence, 30:4
53. Evidence, 36:12
54. 28:17
55. 22:19
56. 33:25; 37:16, 21
57. Evidence, 33:16
58. 36:37
59. 37:45-46
60. 33:18
61. Evidence, 37:16
62. 37:15
63. 34:29
64. 37:35
65. Evidence, 28:16
66. 28:9
67. 28:3-4
68. 34:12
69. Evidence, 28:16
70. 33:41; 35:25; 36:14, 20
71. 28:13
72. 36:12
73. 34:20
74. 21:11
75. Evidence, 22:11
76. 29:5
77. 37:35-36
78. Evidence, 37:26
79. 34:16
80. 24:6
81. Privacy Revealed, p. 25ff. The top ten responses regarding legislation or agencies that help Canadians deal with privacy were: 1. Human Rights Legislation; 2. Access to Information Act; 3. Freedom of Information Act; 4. Privacy Act; 5. Charter of Rights and Freedoms; 6. Government; 7. Ombudsman; 8. Consumer Protection Act; 9. Privacy Commissioner; 10. Credit Bureau.
82. Again this is borne out by the findings of Privacy Revealed, p. 27, which shows that Quebec residents are twice as likely as residents of other provinces to report awareness of privacy-related legislation or agencies (33% compared to less than 15%). See chapter three for a discussion of the Quebec legislation.
83. Evidence, 33:27-28
84. 38:9
85. 36:18
86. Evidence, 21:22
87. 34:16-17