
Bias at the Door: The Case for Banning Facial Recognition in Residential Settings

-- By IdilBarre - 16 Apr 2025

Introduction

Biometric recognition technology (BRT) verifies or identifies individuals by matching newly captured biometric data against a database of previously enrolled biometric records. One of the most common forms of BRT is facial recognition technology (FRT), which uses an individual's physiological characteristics, such as facial features, eye color, skin color, and hair color, to verify or identify them. Landlords across the United States have been installing FRT systems, many of which are marketed as tools to expedite evictions, under the guise of enhancing security for their tenants. This pernicious technology poses substantial risks to tenants, including unlawful evictions and arrests, harassment by landlords, heightened surveillance and over-policing of racialized tenants, the degradation of the sanctity of individuals' private lives, and subjugation to flawed technology.

Background of FRT

Biometric recognition systems automatically recognize and identify individuals based on their unique physical and behavioral characteristics, and FRT is among the most widely deployed of these systems. The mechanics of FRT require a photographic image, or a frame from a video capturing an individual, to be converted "into a template or a mathematical representation of the photo"; that template is then compared to another image or still using a matching algorithm that "calculates their similarity."
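
To make these mechanics concrete, the short Python sketch below illustrates the matching step in miniature. The templates and the match threshold are hypothetical stand-ins, not any vendor's actual values: real systems derive templates from trained neural networks and tune their own cutoffs.

    import math

    def cosine_similarity(a, b):
        # Score how alike two face templates are (1.0 means identical direction).
        dot = sum(x * y for x, y in zip(a, b))
        norm_a = math.sqrt(sum(x * x for x in a))
        norm_b = math.sqrt(sum(x * x for x in b))
        return dot / (norm_a * norm_b)

    # Hypothetical cutoff; vendors tune this threshold themselves.
    MATCH_THRESHOLD = 0.9

    def is_match(probe_template, enrolled_template):
        # Anything scoring above the threshold is declared a "match",
        # including false positives, the source of the error rates
        # discussed in the next section.
        return cosine_similarity(probe_template, enrolled_template) >= MATCH_THRESHOLD

    # Hypothetical templates: one captured at a door camera, one on file.
    probe = [0.12, 0.85, 0.40]
    enrolled = [0.10, 0.80, 0.45]
    print(is_match(probe, enrolled))  # True: similarity exceeds the cutoff

The threshold is doing all the work here: set it lower and the system "recognizes" more people it should not; set it higher and it locks out the people it is supposed to admit.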

General Concerns with BRT and FRT

While biometric recognition technology is commonly deployed by the federal government and law enforcement agencies under the guise of public safety and security, a growing number of other uses have emerged: BRT has been adopted by businesses, consumer data collection agencies, banks, schools, and landlords, and has become entrenched in many aspects of public and private life. Critics argue that the technology threatens individuals' constitutional rights and personal freedoms, including the right to privacy, freedom of expression, freedom of association, and due process. These privacy concerns center on the idea that these systems can "quickly, cheaply, and easily ascertain where people have been, who they are with, and what they're doing, all based on a unique marker" that cannot easily be changed or obscured: their face.

Bias, error rates, and racism

Facial recognition algorithms have been found to be systematically biased against people with darker skin, producing more false positives for them than for people with lighter skin. In a groundbreaking 2018 study, researchers at the Gender Shades project published an audit of the accuracy of major commercial facial recognition algorithms and found error rates of up to 34 percent for darker-skinned women, compared with only 0.8 percent for lighter-skinned men.
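
To illustrate what an error-rate gap of this size means at the scale of a residential deployment, the back-of-the-envelope Python sketch below applies the reported rates to a hypothetical building. The building size and scan frequency are assumptions for illustration only, not figures from the study, and the audit's classification error rates are treated here, loosely, as per-scan misreading rates.

    # Reported Gender Shades error rates, used as rough per-scan
    # misreading rates for illustration.
    error_rates = {
        "darker-skinned women": 0.34,
        "lighter-skinned men": 0.008,
    }

    residents_per_group = 250   # hypothetical tenants in each group
    scans_per_day = 4           # hypothetical entries/exits per tenant per day

    for group, rate in error_rates.items():
        daily_errors = residents_per_group * scans_per_day * rate
        print(f"{group}: roughly {daily_errors:.0f} misreadings per day")

Under these assumptions, the same building produces roughly 340 misreadings a day for one group and roughly 8 for the other: the bias is not an abstraction but a daily, repeated burden on particular tenants.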

The bias found in these systems can be largely attributed to the biases of their developers: "When coding, humans are likely to reflect their own priorities, preferences, and prejudices." While this alone is a difficult problem to overcome, it is compounded by the fact that "algorithms are often written by homogeneous developers…namely white men."

Racial bias in biometric recognition is not limited to facial recognition; it also appears in voice recognition, another commonly used form of BRT. A 2020 study published in the Proceedings of the National Academy of Sciences (PNAS) showed that voice recognition technology from companies like Apple and Microsoft was twice as likely to incorrectly transcribe audio from Black speakers.

Current use by landlords

Facial recognition technology is particularly insidious in the housing and residential context because people are affected in their homes, "a sacred and legally protected space where people are most entitled to enjoy their privacy rights and be free from intrusion." At Knickerbocker Village in Manhattan, for example, where a facial recognition entry system was installed despite residents' concerns, tenants are automatically subjected to faulty surveillance the moment they step outside their homes.

A 2022 report by the Anti-Eviction Mapping Project found that thousands of residential buildings in New York City now use virtual doormen, facial recognition, and keyless entry technologies, including products from companies that advertise their ability to help expedite evictions. Information extracted from these systems can be used to catch alleged lease-breaking behavior or to report alleged criminal activity, giving landlords an opportunity to evict tenants in favor of more desirable ones.

In 2019, rent-stabilized tenants at the Atlantic Plaza Towers affordable housing complex in Brooklyn successfully filed a complaint with New York State's Division of Housing and Community Renewal (DHCR) to stop their property owner from installing facial recognition software. Residents also described the "chilling effect that the video surveillance had on tenant participation in the tenant association and community activities."

Conclusion

FRT remains a pernicious technology in both private and public contexts, but its use by landlords poses particularly egregious and unjustifiable risks to tenants. FRT is unreliable, with error rates that disproportionately affect the very people who are targeted with and subjected to this invasive technology: low-income people of color. FRT has proven not only inaccurate but has led to false arrests, unlawful detainments, and widespread misidentification. The evidence fails to show that FRT actually enhances security for communities; instead, it exposes tenants to substantial risks.

Although potential remedies are available under existing law, such as the Fair Housing Act (FHA), they are not enough to protect tenants from discriminatory technology like FRT. Even though the FHA permits disparate impact claims, it remains outdated and has failed to keep pace with an ever-changing technological landscape.

The current framework for tenants seeking to bring disparate impact claims based on FRT is insufficient: the heightened standard makes it unlikely that these claims will withstand the robust causality inquiry. In light of this, the FHA must be amended to prohibit facial recognition technology in the housing context entirely.
