The Erosion of Privacy
-- By ArtCavazosJr - 14 May 2010
A few years ago, when confronted with actual evidence that the U.S. government was spying on its citizens, many people didn’t seem to care. In fact, it was quite common to hear people say, “Why should I care? I have nothing to hide.” Others said, “Even if they have embarrassing info about me, it will probably never be seen by human eyes, and if it is, they won’t even know who I am. Why would they care, and why should I?” Those concerned with terrorism, patriotism or security might say, “Plus, our collective interest in (if not right to) national security is far more important than any negligible privacy interest that law-abiding citizens may have. Only criminals and terrorists have anything to fear.” If the government, corporations, or others are collecting data on us, why should we care? Are there good enough reasons, like security or the economy, for them to do so? And if we should care, should we try to stop it?
Why Should We Care?
Loss of Personal Autonomy
Most people conjure up George Orwell’s 1984 when they think about government infringements on privacy rights. This is the “big brother” model, whereby our freedoms and autonomy are eroded by a large centralized force. While this may well be a danger, it has lost much of its impact with large subsets of society, particularly the “why should I care?” set. Daniel Solove argues for another analogy: Franz Kafka’s The Trial. When privacy harm is framed in Orwell’s terms (top-down social control and inhibition), the “I have nothing to hide” camp is probably correct, at least for now, that most law-abiding citizens won’t be immediately and directly affected by this type of law enforcement monitoring. When framed in Kafka’s terms, however (a bureaucracy uses people’s data to make important decisions about them while excluding them from the process), the multitude of even “benign” data available suddenly becomes far more interesting. If this data is used to analyze trends and predict behavior, there is a danger that our autonomy will be greatly eroded by policy decisions that rely on this kind of information. I won’t speculate whether this kind of paternalistic control would tend toward sinister or benign ends, or argue here that such control is undesirable in and of itself; what matters most is that in every scenario involving such widespread and invasive data aggregation and analysis, our personal freedoms lose out.
Loss of Security
In addition to monitoring and control problems, there is the so-called “privacy theater” problem. As early as 1997, Dr. Latanya Sweeney, a computer scientist and professor at Carnegie Mellon, showed that seemingly benign data (birth date, zip code, gender, race, marital status, etc.) could be used to re-identify records that were previously thought to be anonymous. The process works on a simple theory: “anonymous” data is usually missing so-called personally identifiable information (PII) like names, social security numbers, etc., but if you can find another database that contains that information (say, voter registration rolls, which can be purchased in most states for $20 or less), you can cross-reference the two, and by repeating the process you can eventually assemble a “complete” data set. In practice, of course, this can be more difficult, and for years legislation and “security experts” assumed that by deleting PII they were denying adversaries the information they needed to complete data sets of, say, medical records, financial statements, email accounts, and the like. Unfortunately, in the modern era, the assumption that someone cannot find a specific piece of information is a very poor one indeed. The “privacy theater” critique is that security measures and legislation focused on PII merely let us go through the motions without really protecting anything.
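To make the cross-referencing step concrete, here is a minimal Python sketch. The tables, field names and records are made up for illustration; they merely stand in for the kinds of “anonymized” releases and purchasable voter rolls described above, not for any real data set.

```python
# A minimal sketch of the cross-referencing technique described above.
# All tables, names and values here are hypothetical.

# "Anonymized" medical records: PII (names, SSNs) stripped out, but the
# quasi-identifiers (birth date, ZIP code, sex) left in place.
medical_records = [
    {"birth_date": "1945-07-21", "zip": "02138", "sex": "F", "diagnosis": "hypertension"},
    {"birth_date": "1982-03-05", "zip": "10027", "sex": "M", "diagnosis": "asthma"},
]

# A publicly purchasable voter roll: names attached to the same fields.
voter_roll = [
    {"name": "Jane Doe", "birth_date": "1945-07-21", "zip": "02138", "sex": "F"},
    {"name": "John Roe", "birth_date": "1982-03-05", "zip": "10027", "sex": "M"},
]

QUASI_IDENTIFIERS = ("birth_date", "zip", "sex")

def key(record):
    # Project a record onto its quasi-identifiers.
    return tuple(record[field] for field in QUASI_IDENTIFIERS)

# Index the voter roll by quasi-identifiers, then "join" the two tables.
voters_by_key = {key(v): v["name"] for v in voter_roll}
for record in medical_records:
    name = voters_by_key.get(key(record))
    if name is not None:
        print(f"{name}: {record['diagnosis']}")  # the record is re-identified
```

The point of the sketch is how little work the “attack” takes: one dictionary lookup per record, with no PII required in the “anonymized” table at all.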
By assembling fragmented data into complete data sets like a jigsaw puzzle, it is possible to create and maintain potentially enormous databases full of not only sensitive information but also seemingly benign data such as tastes, preferences and other obscure details. This type of information can be used to bypass password recovery mechanisms for email accounts, bank accounts, federal student loan accounts, social networking sites, etc. Endless personal data, including email correspondence, medical diagnoses, drug prescriptions and internet search history, is housed in countless databases and is potentially available to almost anyone (it might be “anonymized” and released for research, lost through security leaks and attacks, simply gathered and utilized by ISPs and companies like Google, Apple and Amazon, or sold to so-called fourth-party companies to be data mined for profit). The ease with which individuals can be uniquely identified from this seemingly benign data is startling, and its potential to cause harm is real.
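How identifying these “benign” attributes are can itself be measured. Below is a small, hypothetical Python sketch (made-up records and field names) that reports what fraction of a data set is unique on a handful of such attributes; any record that is unique on them is exactly the kind of record the previous sketch can re-identify.

```python
# A rough way to quantify the claim: count how many records are unique on
# their quasi-identifier combination. Data and field names are made up.
from collections import Counter

def fraction_unique(records, fields):
    # Fraction of records whose combination of `fields` appears exactly once.
    counts = Counter(tuple(r[f] for f in fields) for r in records)
    unique = sum(1 for r in records if counts[tuple(r[f] for f in fields)] == 1)
    return unique / len(records)

people = [
    {"birth_date": "1990-01-01", "zip": "10025", "sex": "F"},
    {"birth_date": "1990-01-01", "zip": "10025", "sex": "M"},
    {"birth_date": "1973-11-30", "zip": "60614", "sex": "M"},
]

# In this toy sample every record is unique on (birth date, ZIP, sex);
# Sweeney's work found that a large majority of the U.S. population is.
print(fraction_unique(people, ("birth_date", "zip", "sex")))  # 1.0
```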
Should We Try to Stop It?
Collectively, we have all already lost more of our privacy than we care to recognize. Perhaps this will be disastrous, or perhaps we will simply emerge into a new society that will inarguably be forced to confront issues and topics we’d just as soon leave alone. Will the government be watching us all like big brother, or will it simply make all of our choices for us? Will companies continue to find newer and better ways to exploit our secrets for their pecuniary gain? Or will we all carry smartphones to snap pictures of friends and complete strangers, and, using the same face-recognition technology that already auto-tags our friends, have entire dossiers of their personal information at our fingertips?
As long as technology continues to advance, and people continue to subscribe to the “always connected” culture of Facebook, cell phones, GPS, wireless internet, AIM and G-Chat, privacy will continue to erode. This is the fundamental social shift of our time, and there is no real way to keep it from happening. Throwing away your computer or deleting all your online accounts won’t change anything. What I’m less sure of is the extent and reversibility of the impact our connected culture will have on the ecology of software development, and (sorry Eben) I don’t know enough to say whether free software or any other potential “magic bullet” is the elixir that could solve all our problems. But what can, and should, happen is the renewed protection of rights we have always - until recently, it seems - held dear, and greater accountability for corporate players. Freedom, autonomy and liberty will require new protections, and egregious infringements of rights should never be tolerated. The “Myspace Generation” that entered adolescence knowing nothing else is already here, and the trend towards greater openness and interconnectedness will continue. Where it takes us requires our utmost attention, but it is futile to try to stop or reverse this trend. Instead, we should work on alleviating the adverse social impacts of our shiny new technologies by reining in the big players and educating the average net denizen, rather than retreating from such technologies like modern hermits or decrying them like would-be iconoclasts.
We all have a deep intuitive sense of the normative value of privacy, but it will continue to be eroded in the pursuit of progress. Employers will utilize ever more invasive background checks, law enforcement will take advantage of new resources for catching criminals and terrorists, and society will applaud the advances as it has everything else that has made our lives “better” or “improved” our quality of life. The only question now is: how much lost privacy will we tolerate, and how will we strike the balance in a new world of readily accessible intimate information? We simply can’t stop the erosion of privacy entirely (perhaps too much has eroded already for that, and people have proven all too willing to accept egregious terms and practices - like those of Facebook and Gmail - as a cost of doing business, or of being a part of “internet society”), but we should ensure that it occurs on our terms from now on - something most of us have not been doing when we point and click “agree” and grunt with satisfaction as our apps download and our personal liberties disappear.