Law in the Internet Society

Who Does Surveillance Impact the Most?

-- By AndreaRuedas - 20 Oct 2024

From photographic tracking to fraud alerts to risk assessment in prisons, technologies that use algorithms to make life-altering decisions are increasingly widespread. Virginia Eubanks’ discussion of how racism and classism plague automated welfare systems demonstrates that surveillance systems and modern technology negatively infiltrate the lives of the most marginalized (2017). Her findings point to the intersectional oppressions that can be, and are, perpetuated by contemporary surveillance, because it rests on prejudiced datasets that ultimately shape everyday technologies. To understand how people are continuously marginalized by modern surveillance technologies, we must understand the intent behind the creation of systems of discipline and control, the lack of agency given to individuals in algorithmic decision processes, and the lack of access to corrective measures, along with its impact on present-day oppression.

The Progression of Systems of Power and Control

In the 18th century, there was a shift from sovereign to disciplinary power, one focused on order, observation, and hierarchies (Foucault 198). Though first deployed to control the spread of plague, disciplinary systems became a mechanism of power centered “around the abnormal individual, to brand him and to alter him” (199). These were the beginnings of disciplinary institutions, such as mental asylums and prisons, which targeted and sought to reform the “abnormal” while marking and perpetuating their exclusion (200). Surveillance mechanisms were thus created and used to discipline and exclude, biased against those who did not fit the status quo.

In the present, we live in “societies of control,” where disciplinary power interacts with control power: a control based on numerical classifications that is pervasive, fluid, and seemingly everlasting (Deleuze 4-6). This control uses current and past information to haunt a person forever because it produces dividuation, in which there is no single, holistic, humane representation of a person but many digital versions, built as much on real yet isolated moments as on false assumptions or biases in datasets (6). These systems of control permeate nearly every aspect of our public and private lives, facilitated by algorithms that sort, classify, and judge. Prisons, for example, a disciplinary invention that already targets the poor and the disabled, the “abnormal,” combine with control systems such as risk assessments to create what Deleuze calls a “new monster,” exacerbating earlier discrimination and producing racialized, classist mass incarceration.

As Solove describes, one of the main concerns with modern surveillance is its use of information processing for discipline and control, mechanisms that Deleuze and Foucault theorized rest on the subjugation of undesired, undervalued groups (2011). Solove identifies the data-processing problems that place already marginalized people at higher risk of harm: aggregation, which strips privacy and autonomy away from individuals; exclusion, which denies people knowledge of how their data is used and the ability to correct errors, already extremely difficult without the appropriate resources and mobility; and secondary use, which distributes personal information without consent and, combined with distortion, can create false representations of the self.

The Monster Blinds Us and Ties Our Hands

These four issues contribute to a Kafkaesque reality in which data collection and technological implementation are dangerous precisely because their existence is known but their uses are mysterious. Collective ignorance about collection methods and applications leads to preemptive changes in our thoughts and behaviors: without knowing what is known about us and how it will be used, we are afraid to assert our civil rights and do not demand accountability. This chilling of our willingness to advocate for human and civil rights allows current and past injustices to continue.

A Case Study of the Automation of Welfare

Eubanks’ analysis of the demise of the public benefits system in the United States concurs with Foucault, Deleuze, and Solove: “We all inhabit this new regime of digital data, but we don’t all experience it in the same way” (Eubanks 5). The discipline and control systems outlined by Foucault and Deleuze impact marginalized people through the modern mechanisms Solove defines. Eubanks further claims that this ultimately burdens particular groups, such as poor people, people of color, and women, the most. Why? Because they encounter more data collection systems in their lives: international borders, welfare systems, and heavily policed neighborhoods. Access to basic human needs such as housing, food, or healthcare becomes harder to attain for groups who exist at the margins of society and must now interact with systems designed to exclude.

The events in Indiana, where the automation of welfare made it impossible to continue receiving benefits and barred recipients from correcting algorithmic mistakes, make clear that low-income communities of color “are targeted by new tools of digital poverty management” (Eubanks 11). While we might excuse these impacts as unintended consequences, they are not and have never been unforeseeable. History tells a long tale of the surveillance of poor, disabled, and nonwhite bodies through both older and modern systems that act as “forces for control, manipulation, and punishment” (9). The inaccessibility of automated systems only further complicates marginalized people’s ability to self-advocate. In the Indiana welfare system, people could not find anyone to help them regain benefits and were left without medication, without food, and without housing, showing that when we give all the power to technology and none to people, we strip our society of the ability to protest and reverse its effects.

Sources Cited

Deleuze, Gilles. 1992. “Postscript on the Societies of Control.” October 59: 3-7.

Eubanks, Virginia. 2017. Introduction and Chapter 2 in Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor. St Martin’s Press.

Foucault, Michel. 1975. Chapter 3 in Discipline and Punish: The Birth of the Prison. Vintage.

Solove, Daniel J. 2011. “Why Privacy Matters Even if You Have ‘Nothing to Hide.’” The Chronicle of Higher Education.

