Financial Privacy, Digital Redlining, and Restoring the Commons
-- By RaulCarrillo - 17 Dec 2014
Introduction
We have spent a substantial amount of time in this course discussing invasions of privacy, including in the sphere of personal finance. As we know, data mining is not a new phenomenon. Even the mainstream media has reported on how information harvesting is used to bombard potential customers with advertisements. Purportedly, there is a trade-off between privacy and the convenience of tailored consumer choices.
Even when Internet surveillance offers us an ostensibly better menu of “services”, the process constitutes an affront to privacy rights. Yet it is even worse when personal information, especially financial information, is harvested to overtly punish the people it is harvested from. In the case of financial services, we now live in a world where data is siphoned and subsequently used to entrench the very socioeconomic inequalities financial services purport to render more equitable.
Algorithms & Legalized Discrimination
In a piece earlier this spring entitled “Redlining for the 21st Century”, Bill Davidow of The Atlantic skimmed the surface of how private companies, especially in the finance, insurance, and real estate sectors, are using algorithms that charge particular people more for financial products based on their race, sex, and where they live. For example, algorithms used by mortgage companies will use big data to determine a customer’s ZIP code from their IP address, and then proceed to charge them a higher rate based on what neighborhood they live in. If a human employee did this rather than an algorithm, supposedly divorced from human manipulation, that individual would be acting in clear violation of the Fair Housing Act of 1968. Yet the algorithm, and thus the practice, is permitted.
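The mechanism Davidow describes can be sketched in a few lines of code. To be clear, this is a hypothetical illustration, not any actual lender's system: the IP prefixes, ZIP codes, and rate adjustments below are invented, and real systems use commercial geolocation databases rather than a hand-written table. The point is how little machinery is needed to turn a visitor's network address into a higher price.

```python
# Hypothetical sketch of geolocation-based rate-setting. All values here
# are invented for illustration; no real lender's data or logic is shown.

# Coarse IP-prefix-to-ZIP lookup (real systems use commercial geolocation
# databases with far finer resolution).
IP_TO_ZIP = {
    "203.0.113.0/24": "60644",   # example prefix mapped to one ZIP code
    "198.51.100.0/24": "60614",  # example prefix mapped to another
}

# Per-neighborhood rate adjustments: a facially "neutral" geographic input
# that can proxy for race wherever neighborhoods are segregated.
ZIP_RATE_ADJUSTMENT = {
    "60644": 0.015,  # +1.5 percentage points
    "60614": 0.0,
}

BASE_RATE = 0.045  # hypothetical 4.5% baseline mortgage rate

def quoted_rate(ip_prefix: str) -> float:
    """Return the rate a visitor from this IP prefix would be quoted."""
    zip_code = IP_TO_ZIP.get(ip_prefix)
    adjustment = ZIP_RATE_ADJUSTMENT.get(zip_code, 0.0)
    return BASE_RATE + adjustment
```

Two otherwise identical applicants receive different quotes before either has entered a single fact about their finances, which is precisely why a human doing the same thing would run afoul of the Fair Housing Act.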
University of Maryland law professors Danielle Keats Citron and Frank Pasquale, the latter the author of The Black Box Society, have recently detailed how credit scoring systems have always been “inevitably subjective and value-laden,” yet are made to seem “incontestable by the apparent simplicity of [a] single figure.” Although many of the algorithms in question were initially built to eliminate discriminatory practices, credit scoring systems can only be as free from bias as the software behind them. Thus the biases and values of system developers and software programmers are embedded into each and every step of development.
Although algorithms may assign a low score to “occupations like migratory work or low-paying service jobs” or deem residents of certain neighborhoods less creditworthy, the law does not require credit bureaus to reveal how they convert data into a score. That process is a trade secret, as we have discussed in class. Even though the people negatively impacted by the structure of an algorithm may be disproportionately members of minority groups, the process is almost entirely immune from scrutiny. Title VII of the Civil Rights Act of 1964 has been deemed “largely ill equipped” to address the discrimination that results from data mining. As mathematician and former financier Cathy O’Neil has written, whether discriminatory data mining is intentional is a moot point: seemingly innocent choices can have a disparate impact upon protected classes.
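The "apparent simplicity of a single figure" that Citron and Pasquale describe can be illustrated concretely. The sketch below is invented, not any bureau's actual (trade-secret) formula; the features and weights are hypothetical. It shows how facially neutral inputs like occupation type and neighborhood fold the designers' assumptions into one opaque number, so that two applicants with identical payment histories diverge solely because of where they live and what work they do.

```python
# Hypothetical scoring model. The features, weights, and base score are
# invented for illustration; real credit-scoring formulas are trade secrets.

WEIGHTS = {
    "payment_history": 120.0,            # rewards on-time payment
    "occupation_stability": 80.0,        # penalizes migratory or seasonal work
    "neighborhood_default_rate": -300.0, # penalizes entire ZIP codes
}

def credit_score(features: dict) -> int:
    """Collapse several value-laden inputs into one opaque figure."""
    base = 500
    score = base + sum(WEIGHTS[k] * features.get(k, 0.0) for k in WEIGHTS)
    return round(score)

# Identical payment histories; the gap comes entirely from occupation
# and neighborhood -- inputs the applicant never sees weighted.
applicant_a = credit_score({"payment_history": 1.0,
                            "occupation_stability": 1.0,
                            "neighborhood_default_rate": 0.02})
applicant_b = credit_score({"payment_history": 1.0,
                            "occupation_stability": 0.3,
                            "neighborhood_default_rate": 0.15})
```

Because only the final integer is ever disclosed, neither applicant can tell which input moved the score, which is what makes the figure "incontestable" in practice.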
In true neoliberal fashion, abuse of algorithms for discriminatory purposes is not limited to private companies; governments act similarly in the realm of public finance. In an essay entitled “Big Data and Human Rights”, Virginia Eubanks, an Associate Professor of Women’s, Gender, and Sexuality Studies at SUNY Albany, notes that “the use of large electronic datasets to track, control and discipline U.S. citizens has a long history, going back at least thirty years.” The National Crime Information Center (NCIC) and New York’s Welfare Management System (WMS), for example, initially utilized algorithms to expose discrimination by their own employees. However, faced with the fiscal burden of supplying benefits in the wake of recession, the state of New York commissioned expansive technologies that supplanted the decisions of social workers, granting bureaucracies the ability to deny benefits under the guise of a neutral decision maker. Computers currently make choices about social spending based on “time worn, race and class motivated assumptions about welfare recipients: they are lazy and must be ‘prodded’ into contributing to their own support, they are prone to fraud, and they are a burden to society unless repeatedly discouraged from claiming their entitlements.”
Restoring the Commons
In a collaborative essay for The Morningside Muckraker, I recently wrote about how free software and free culture advocates could benefit from updated understandings of the architecture of public finance, particularly the Modern Money (MM) paradigm. According to this school of thought, money, like data, can be created at zero marginal cost. We have moved from a political economy of scarcity to a political economy of abundance. This realization follows from the legal fact of monetary sovereignty: the Federal Reserve creates dollars with keystrokes; the U.S. government, unlike a state or a household, cannot “go broke”; and Uncle Sam must worry about inflation but need not tax or borrow in order to spend. It in turn yields a new framework for considering how a society could economically support a New Intellectual Commons, as Professor Moglen has called it.
By the same token, practices like financial privacy invasion and digital redlining highlight why economic justice advocates cannot forgo an understanding of “Law and the Internet Society.” If we do not understand the technological infrastructure, we will always be one step behind. Would-be reformers need to know that an attempt to access financial services through the Internet, something to which we should each have free and open access, may be punished solely by virtue of the choice of method. Attempts at social mobility may be stymied. You may be barred from economic advancement via an information infrastructure that is arguably yours by birthright. Coercion abides.
What is occurring with digital redlining fits into Moglen’s narrative of a private assault upon the commons, two layers deep. In essence, data harvested by the government and its licensed agents, particularly banks, within what should be a free knowledge commons is being used to further deny basic tools for economic advancement to historically marginalized groups, entrenching inequalities on multiple fronts.
We must hold the government accountable in order to hold other interests accountable. As Professor Moglen stated:
“What we do have at the moment is the opportunity for a political insistence upon the importance of the commons. We have a great opportunity which lies in the inevitable populist rising of annoyance, then irritation, then anger, at what has happened to the society in which we were all living in relative safety and prosperity only a few years ago. We have an opportunity to explain to people that too much ownership, and too much leverage, and too much exclusivity was the prevailing justification for and also the prevailing reason that what happened, happened.”
Privacy advocates and economic justice advocates are natural allies in this effort, which will surely be an intense moral struggle for the foreseeable future.