Technology: Protecting Privacy

By Shannon Doyle and Matthew Streelman


“The technology which was viewed as a great threat to the human
right of privacy doesn't have to be a great threat. It can also be an
enabler and a facilitator.” - Stephanie Perrin


Technology, specifically information technology, has been expanding and evolving at an alarming rate. Today governments, corporations, and even your next-door neighbors have the ability to collect, organize, and track copious amounts of very personal and yet very public information. Once digitized, this data can be hard to recapture and control. Legislation has provided little aid in developing online consumer privacy protection laws; however, there is growing public demand for better privacy protection options. The public is slowly beginning to realize the magnitude of the problem: “computers have elephant memories - big, accurate, and long term”. 1 The more data that is collected on individuals, the more they are beginning to realize how vulnerable they are. A simple Google search can provide a lifetime of information on a person, from where they were born and how much they paid for their house to how fast they ran their last marathon. Technology is clearly aiding in the erosion of personal privacy, and society is beginning to awaken to this fact. Fortunately, just as technology is fueling privacy vulnerability, it can be harnessed and used to protect privacy. That is why advances in technology, accompanied by increased social demand for more protection, will substantially counter the problem of our eroding privacy.
Privacy: A Working Definition
To help understand how privacy is being restored through technology we have to examine two key issues: a definition of privacy, and whether people believe they are losing it. To see that dwindling privacy is a real issue, consider a telephone survey of 1,000 adults conducted by the Center for Survey Research & Analysis at the University of Connecticut for the First Amendment Center and American Journalism Review, which found that 81% of respondents said that their right to privacy was “essential”. 2 This is an increase from 1997, when the same poll found that only 78% considered privacy “essential”. 3 This poll gives us a clear indication that the people of America do in fact value and demand their personal privacy.
The word privacy has taken on different meanings throughout time. Privacy was first defined through law with the Supreme Court’s interpretation of the Fourteenth Amendment in the 1973 court case Roe v. Wade. 4 The Court ruled based on the idea that everyone has certain inalienable rights; one of those was a “right to privacy”. It is most beneficial to look at a fundamental definition of the word as it is now used and understood in current society. For the sake of consistency and clear understanding throughout this paper, we will look at privacy as a combination of two theories. First we look at a theory presented by Michael Boyle, a faculty member at the University of Calgary who specializes in privacy in technological settings. His theory breaks privacy down into three basic elements: 1, 5

  • Solitude: control over one’s interpersonal interactions with other people
  • Confidentiality: control over other people’s access to information about oneself
  • Autonomy: control over what one does, i.e. freedom of will

We can see that the common notion throughout these elements is that of control, specifically over one’s being and the access that others have to it. This idea of control is a key point, as a recent poll of over 1,000 adults showed that 79% of people said that “it is extremely important to be in control of who can get personal information”. 6 Control of your privacy can be exercised in many ways with specific regard to the use of the internet, including choosing what information about yourself is and isn’t available in a public forum, and controlling the validity of this information. This control also applies to information that you did not choose to have available but that nonetheless has become so. To clarify, we would define a public forum as one that has a relatively easy, low-impact method of participation. Looking at privacy based on this theory is beneficial because it gives us clearly defined terms which we can specifically and easily apply to most situations, including those occurring virtually. Control of information is becoming increasingly important, and having a frame of reference in which to analyze and interpret its effect on, or direct link to, privacy is critical.
The other theory that we will combine to create a framework for how we view privacy is one created by Tavani. He explains the theory of Restricted Access/Limited Control (RALC) as having three components: “the concept of privacy, the justification of privacy, and the management of privacy”. 7 He continues by breaking each of these components down with specific definitions, but for ease of understanding we will give a general synopsis of this theory. It is about the situational aspect of privacy, in which the word situation itself is left open to include any number of things, such as a physical location or a relationship. The RALC theory defines privacy in terms of “protection from intrusion and information gathering by others, through situations or zones that are established to restrict access.” 8 RALC specifically does not take into account the role of control and how that can have an impact on one’s privacy.
Because RALC does not take into account the aspect of privacy control, we felt it was important to combine these two theories into one working definition, one that incorporates both the aspect of the situation and the level of control one has over one’s privacy in that situation. It is also important to understand that we do not claim to know what the future holds, nor do we think that any one theory can be held as consistent and forever true in the changing environment and growing realm of technology. With that in mind, here is how we will represent the privacy that people are currently demanding.
Privacy: a situational framework in which one has the right to exercise various levels of control over who, where, when, and how information about one’s personal self can be administered, taking into account both an inherent level of control over the validity of this information and the reliability of the situation in which one chooses to divulge, alter, or withhold that information.

1 This theory is Boyle’s interpretation of combining Altman’s theory and incorporating elements of Gavison’s theory into his own privacy theory.
A Matter of Ethics: Analyzing the Situation

The main purpose of this paper is to show that losing privacy in the technological realm is an ethical issue that everyone in the new millennium is facing, and to show that this issue is actually being resolved through the mechanics of the capitalistic society in which we live. In order to convey this point, we first need to assess whether there is in fact an ethical issue that needs addressing. To approach this issue we will be using an ethical theory based on the principles of social ethics, with a particular focus on justice, as expanded upon by Thomas Hill. This theory is based on the largely formal, and widely accepted, principle of acting in a way that maximizes the good of all. We will present our issue based on the ideals and definition of privacy stated in this paper, and from there narrow our focus to the privacy of personal information in an on-line environment. This will show that losing one’s personal privacy is indeed increasingly becoming an ethical concern because of how and when it is being done. Hill’s interpretation of this ethical theory of social justice provides five principles against which to analyze a situation in order to determine its ethical substance and relevance. 9

  • There must be a basic security, meaning a person is free from murder, theft and adultery, in order to find an intrinsic value in the opportunity.
  • A basic principle of honesty is expected from every man, taken to the extent that they are at least representing themselves with the best of intentions.
  • A principle of impartiality suggests that similar cases must be treated similarly and to refrain from favoritism when addressing individual claims.
  • A principle of proportionality in justice is necessary for dealing with dissimilar situations, assuming that the punishment should match the crime.
  • The principle of equality also needs to be recognized, in that every person should have an equal voice and should be treated the same until it is proven that they require different treatment.

The next step is to then evaluate an individual’s privacy in an on-line setting against these five principles to determine its ethical relevance.
First, we look at the principle of basic security. Theft was specifically noted by Hill as a matter of basic security, and identity theft in particular has been an ever-increasing threat to on-line users. The next principle against which to evaluate personal privacy is honesty, with specific regard to how people portray themselves through actions and intentions. This is a unique issue when applied to an on-line setting, as the inherent nature of the environment is virtual, allowing its users to create any reality with minimal accountability as to the validity of their personal representation. This has become an ethical issue because it creates a forum in which to change and represent oneself in any manner with little to no notice to other users. Next we evaluate this virtual world against the idea of impartiality. One aspect of people’s privacy when they participate in almost any on-line environment is the information gathering done by companies, covering anything from your name and what sites you visit to your financial status and transactions. This information is collected by a multitude of companies and government entities with the purpose of applying it selectively, in marketing tactics and terrorist profiling for example, a practice that runs counter to impartiality. The principle of proportionality asks that the situation be weighed impartially, with the outcome proportional in weight to what was done. In regards to proportionality, we would argue that a person’s privacy can often be revoked with dire consequences after a very small, possibly even unnoticed, act is committed. Take for example a person using a service such as eBay. Millions of transactions occur every day through its payment system, PayPal, most with only the desired consequence of receiving your ordered item in the mail. However, through dishonest practices it is possible for individuals to exploit PayPal to create separate accounts that are able to contact eBay users through email.
If a user then responds to this fraudulent email under the impression that they are going through the necessary steps to get their desired purchase, they can inadvertently give away their bank account information, allowing the perpetrators access to all their funds. This seemingly simple act of replying to an email then sets into motion events that lead to their money being stolen, a consequence that far outweighs the user’s initial actions. Lastly, we examine a person’s privacy on the internet against the principle of equality. This is harder to evaluate as there are different planes on which to examine equality, such as the equality of one person as compared to another or the equality of information exchanged. For argument’s sake we will base it on the equality of one person as compared to another. There is an inherent lack of equality when it comes to representation on the internet: there are certain financial and educational barriers to being able to use the internet in the first place, and additionally the internet is a place where it is possible to project oneself in any manner one chooses, making it impossible to find any real equity between people.
After examining our situation of eroding privacy on the internet against Hill’s ethical theory of justice, it is easy to conclude that there is in fact an ethical dilemma at hand. People are losing their privacy, losing control of the situation and of their information, in manners that we have shown to be unethical. It is also important to look at how one, and in fact our society as a whole, should attempt to address and continue to remedy the problems that arise from such an ethical dilemma. We would argue that solutions are already occurring in our society through the mechanics of the capitalistic system that now exists. The fundamental nature of capitalism is that of supply and demand: when there is demand for a product or service, the market will create a means of satisfying that need through the supply side of our economy. This is exactly the case when it comes to issues of privacy on the internet.
A Matter of Technology: What Does It Have to Offer?
The overwhelming sentiment towards technology is that with its evolution, people’s personal privacy has come increasingly under attack. With the speed of computers continuously increasing, along with the growing connectivity of the world, this is an understandable feeling. The ease with which information is now able to flow is frightening. “Once information is captured electronically for whatever purpose, it is greased and ready to go for any purpose”. 10 This information can be sliced and diced thousands of times with relative ease, all in a matter of minutes. Not only can one’s personal information be manipulated, but that data can be accessed by millions of people online. A growing number of people are realizing the dangers of this and are beginning to seek a solution. In her opening remarks to a Federal Trade Commission workshop concerning the protection of personal information, Stephanie Perrin of Digital Discretion, a privacy consulting firm, notes, “The technology which was viewed as a great threat to the human right of privacy doesn't have to be a great threat. It can also be an enabler and a facilitator”. 11
When you look at a capitalist society, the premise for change is that when people want and value change, the market works in such a way as to bring it about. Thus, in order for privacy-enhancing technologies (PETS) to work, a demand must be present. A current problem for PETS is that a portion of the public does not understand to what extent their personal information is exposed. Education of the public can aid in this respect, but the critical issue is examining whether or not the general public values privacy, and to what extent. People today “want to communicate a fair amount about their identity. They want to be found, in many cases, as much as they sometimes don't want to be found”. 12 This dynamic is shown through social networking websites such as Facebook and MySpace. On these sites, people willingly provide personal information in order to connect with friends and family. PETS also face a classic psychological dilemma that could severely slow the rate of consumer adoption: consumers do not want to give up extra resources now for a seemingly intangible benefit in the future. Understanding the motivation of the consumer is essential in examining the development of PETS, but it is even more important in predicting their future development. As stated by Danny Weitzner of the World Wide Web Consortium, “we have to accommodate and recognize the fact, as we build these systems [and products], that the production of culture requires the exchange of identity. Commerce requires the exchange of identity”. 13
It should be noted that many obstacles above and beyond consumer demand hinder the adoption of PETS. These obstacles include government regulation driven by national security concerns, as well as judicial rulings regarding the legality of various privacy protection practices. Although these are very substantial obstacles, it is beyond the scope of this paper to address these specific concerns.
Today’s privacy-enhancing technology, viewed through Boyle’s three basic elements of privacy, is primarily focused on anonymity, which is closely related to solitude. These technologies are focused on minimizing the amount of information that can be collected on an individual by disguising and encrypting the actions of that individual. The most basic and most widely used example of this technology is the screen name used in instant messaging programs. Screen names allow individuals to disguise their identity while still allowing them to interact with other individuals. A more sophisticated example of this technology is a software toolkit called Tor. The goal of Tor is to allow for anonymous communications, including anonymous web browsing, email, instant messaging, and even web publishing. Tor specializes in protecting people from what is called “traffic analysis.” Traffic analysis can be used by a variety of people with the intent of finding out who is talking to whom over public connections, and it can be used to collect and track people’s internet behavior. 14 Traffic analysis is a prime example of people losing control of their privacy in situations disproportional to their potential consequences; thus it can be classified as unethical under Hill’s framework in our analysis. Tor is a great example of the ability of technology to solve several issues of privacy erosion, but it serves as a poor example of the monetary potential of these technologies. Tor is a free software download and utilizes donated servers and bandwidth in order to operate. Although this does not bode well for a business, it does aid in the security of its users. “The variety of people who use Tor is actually part of what makes it so secure. Tor hides you among the other users on the network, so the more populous and diverse the user base for Tor is, the more your anonymity will be protected”. 15 Tor is dealing with the same problem many new technologies struggle with: in order to be effective it requires widespread use, but in order to gain widespread use it must be effective. Regardless, minimization provides a great framework for understanding the current goals of PETS. We will see that as technology continues to advance, so too do the goals of PETS.
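Tor’s protection against traffic analysis comes from onion routing: the sender wraps a message in one layer of encryption per relay, and each relay can peel off only its own layer, so no single relay sees both who is communicating and what is being said. The toy sketch below illustrates only this layering idea; the XOR “cipher”, the key values, and the three-hop path are our own simplifications, not Tor’s actual protocol (which uses TLS links and real cryptographic handshakes).

```python
# Toy illustration of onion routing's layered encryption -- NOT Tor's
# actual protocol. The XOR "cipher" and keys here are simplifications.

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """XOR keystream: applying it twice with the same key recovers the data."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def build_onion(message: bytes, relay_keys: list) -> bytes:
    """Sender wraps the message in one layer per relay, exit layer innermost."""
    onion = message
    for key in reversed(relay_keys):  # entry relay's layer goes on outermost
        onion = xor_cipher(onion, key)
    return onion

def peel_layer(onion: bytes, key: bytes) -> bytes:
    """Each relay removes exactly one layer; it never holds the other keys."""
    return xor_cipher(onion, key)

# A three-hop path: entry, middle, exit. Each relay shares one key with
# the sender; no single relay can read the innermost plaintext alone.
relay_keys = [b"entry-key", b"middle-key", b"exit-key"]
onion = build_onion(b"who talks to whom stays hidden", relay_keys)
for key in relay_keys:  # the message loses one layer at each hop
    onion = peel_layer(onion, key)
print(onion)  # b'who talks to whom stays hidden'
```

In the real system each relay only learns its predecessor and successor on the path, which is what frustrates an eavesdropper trying to link sender to destination.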
As the need to protect and control one’s privacy has become more of a desirable commodity, companies have begun to expand their technology into other components of Boyle’s three basic elements of privacy. This includes moving away from information minimization, which highlights solitude, and into information transparency, which highlights confidentiality and autonomy. These technologies primarily attempt to automate privacy standards and allow consumers to easily find out what information is being collected on them. An example of this technology is seen in the Platform for Privacy Preferences Project (P3P). P3P “enables websites to express their privacy practices in a standard format that can be retrieved automatically and interpreted easily by user agents”. 16 (P3P should not be confused with the small lock icon in the corner of most web browsers, which signals an encrypted connection rather than a privacy policy.) In browsers that implement P3P, most notably Internet Explorer 6, the browser retrieves a site’s policy automatically, compares it against privacy standards set by the user, and alerts the user when the site does not comply; those standards can be tightened or relaxed as needed. Along with providing instantaneous feedback on the privacy practices of a website, P3P also provides a means to access the written policies of the company. The downside of P3P is that, as of now, it merely acts as an inspector and not as an auditor. P3P can tell you what a company has said it is going to do with your information, but it has no way of making sure the company follows through with its own written procedures. The inability of P3P to audit the various websites leaves room for violations of the honesty principle in our ethical framework; however, P3P still gives users increased control over the situation by allowing them to know what a company claims it is or isn’t doing with their information. If the technology continues to evolve and gains the ability to audit websites, it will strengthen the ethical integrity of these various websites.
Currently the bottom line is that it still comes down to the integrity of the company. P3P is a great technological advancement, but more importantly it marks a noted shift in PETS: from minimization to transparency and automation. The fact that P3P operates behind the scenes and does not require user interaction is a major advancement that is sure to be built upon in the future. Again, P3P serves as a good technological example and a poor business model; it is a project of the World Wide Web Consortium, a non-profit organization, and is dedicated in part to increasing consumer education. Fortunately, P3P has been considered as much a cultural phenomenon as a technological one. It has created a unified force that emphasizes the issue of privacy and in turn has driven companies to examine their privacy policies; in many cases, it has driven businesses to create their first privacy policies. P3P has made companies aware that consumers are becoming more and more concerned about privacy, and that they should seriously examine the demands of their customers and what those demands mean for how customers’ personal information is protected and controlled.
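To make the inspector role concrete, consider how a P3P-aware user agent might compare a site’s machine-readable policy with a user’s settings. P3P 1.0 defines a “compact policy”: a list of short tokens a site can send in an HTTP `P3P` response header. The sketch below is our own simplified illustration; the tokens are drawn from the real compact-policy vocabulary (meanings paraphrased), but the matching rule and function names are hypothetical, not part of the P3P specification.

```python
# Simplified sketch of a P3P-aware user agent. Tokens follow the P3P 1.0
# compact-policy vocabulary (meanings paraphrased); the matching rule and
# function names are our own illustration, not the P3P specification.

TOKEN_MEANINGS = {
    "NOI": "no identified data is collected",
    "DSP": "the policy references a dispute-resolution procedure",
    "COR": "errors in collected data will be corrected",
    "TEL": "data may be used for telemarketing",
    "OTR": "data may go to recipients following different practices",
}

def parse_compact_policy(header: str) -> set:
    """Extract tokens from a compact-policy header like: CP="NOI DSP COR"."""
    start = header.find('"') + 1
    end = header.rfind('"')
    return set(header[start:end].split())

def site_meets_preferences(header: str, forbidden: set) -> bool:
    """Simplified user rule: reject any site declaring a forbidden practice."""
    return parse_compact_policy(header).isdisjoint(forbidden)

# The user forbids telemarketing and third-party sharing.
forbidden = {"TEL", "OTR"}
print(site_meets_preferences('CP="NOI DSP COR"', forbidden))  # True
print(site_meets_preferences('CP="DSP COR TEL"', forbidden))  # False
```

Note that this checks only what the site declares; as discussed above, nothing here verifies that the company actually follows its stated practices, which is exactly the inspector-versus-auditor gap.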
Looking into the future, it is quite clear that PETS are going to grow in importance and in acceptance through the mechanics of our capitalistic society. It is important that developers continue to monitor consumers’ wants and preferences in order to maximize acceptance and adoption rates. The overwhelming trend in PETS is ease of use through automation. Stephanie Perrin of Digital Discretion makes the requirement loud and clear. She says of future products, “It's got to be easy. It has to have no additional consumer burden, no load. People want it for free. They want it bundled with their products. They don't want to be nickeled and dimed to death”. 17 Consumers will begin seeing PETS integrated into everyday products such as their web browsers. Perhaps there will be an icon that allows you to turn an anonymity feature on and off. Perhaps P3P will evolve into an auditor as well as an inspector; that way, if a site claims one thing and does another, P3P will be able to warn you or simply refuse to display the site. Regardless, these technologies can only improve, strengthening our defenses against the unethical erosion of privacy. Currently, PETS act as add-ons to pre-existing infrastructures, but there is a strong push to build PETS into the infrastructure itself. “Whether we're talking about the traditional PETS that are about minimization, or whether we're talking about technologies like P3P -- technologies based on P3P -- that enhance user control, which enhance transparency and choice, these have got to be built deeply into the infrastructure”. 18 The infrastructure can be the programs developed or the actual architecture of the internet. Integrating privacy controls into the architecture of programs and of the internet itself would provide the necessary tools for an all-encompassing pro-privacy environment.
Conclusion
By mixing the privacy frameworks put forth by Boyle and Tavani, a working definition of privacy was created. It was defined as the right of an individual to control one’s interpersonal interactions, to control other people’s access to personal information, and to control what one does. Our argument then continued by acknowledging that privacy is circumstantial and therefore hard to pinpoint in every situation. It was important to combine multiple theories in order to develop one working definition that incorporated both the aspect of the situation and the aspect of control, making the notion of privacy more comprehensive. We continued by applying the situation of personal privacy to Hill’s ethical framework of justice, which allowed us to better show the magnitude of the issue and exactly where the ethical problems lie. After setting up this very important groundwork, we were able to show how the supply-and-demand properties of our capitalistic society are working, with little help from legislative bodies, to remedy the ethical privacy dilemmas everyday Americans are facing. Through the development of PETS such as P3P and programs like Tor, the market is responding to this ever-increasing issue of consumer privacy. While we have argued that these are working to make the internet a safer, more private environment, it is important to note that they are not complete, comprehensive defenses against privacy erosion. These advances are simply the steps that society as a whole is taking towards meeting its own demand, and they will continue to evolve as consumers’ needs and demands evolve, thus perpetuating the cycle of American capitalism.

Works Cited
1 Moor, James H. “Towards a Theory of Privacy in the Information Age.” ACM SIGCAS Computers and Society 27.3 (1997): 27 – 32.
2 “Public Opinion on Privacy.” Electronic Privacy Information Center. 15 May 2006. 5 Apr. 2007 http://www.epic.org/privacy/survey/default.html.
3 “Public Opinion on Privacy.” Electronic Privacy Information Center. 15 May 2006. 5 Apr. 2007 http://www.epic.org/privacy/survey/default.html.
4 Olsen, J. S., ed. Historical Dictionary of the 1970s. Westport, CT: Greenwood Press, 1999.
5 Boyle, M. “A Shared Vocabulary for Privacy.” UBICOMP 2003 5th International Conference on Ubiquitous Computing. 12 Oct. 2003. 15 Feb. 2007 http://grouplab.cpsc.ucalgary.ca/papers/2003/03-UbicomWorkshop.Boyle/boyle-ubicomp-w6-privacy.pdf
6 “Public Opinion on Privacy.” Electronic Privacy Information Center. 15 May 2006. 5 Apr. 2007 http://www.epic.org/privacy/survey/default.html.
7 Tavani, H. T. “Philosophical Theories of Privacy: Implications for an Adequate Online Privacy Policy.” Metaphilosophy 38.1 (2007): 1–22.
8 Tavani, H. T. “Philosophical Theories of Privacy: Implications for an Adequate Online Privacy Policy.” Metaphilosophy 38.1 (2007): 1–22.
9 Hill, T., (1956). Ethics in Theory and Practice. Thomas Y. Crowell Comp, New York.
10 Moor, James H. “Towards a Theory of Privacy in the Information Age.” ACM SIGCAS Computers and Society 27.3 (1997): 27 – 32.
11 “Technologies For Protecting Personal Information: The Consumer Experience.” Federal Trade Commission. 14 May 2003. 25 Feb. 2007 http://www.ftc.gov/techworkshop.
12 “Technologies For Protecting Personal Information: The Consumer Experience.” Federal Trade Commission. 14 May 2003. 25 Feb. 2007 http://www.ftc.gov/techworkshop.
13 “Technologies For Protecting Personal Information: The Consumer Experience.” Federal Trade Commission. 14 May 2003. 25 Feb. 2007 http://www.ftc.gov/techworkshop.
14 “Overview.” Tor.eff.org. 22 Aug. 2006. 3 Mar. 2007 http://tor.eff.org/overview.html.en.
15 “Overview.” Tor.eff.org. 22 Aug. 2006. 3 Mar. 2007 http://tor.eff.org/overview.html.en.
16 “P3P: The Platform for Privacy Preferences.” W3.org. 3 Mar. 2007 http://www.w3.org/P3P.
17 “Technologies For Protecting Personal Information: The Consumer Experience.” Federal Trade Commission. 14 May 2003. 25 Feb. 2007 http://www.ftc.gov/techworkshop.
18 “Technologies For Protecting Personal Information: The Consumer Experience.” Federal Trade Commission. 14 May 2003. 25 Feb. 2007 http://www.ftc.gov/techworkshop.