Surveillance Capitalism: How Humans Became Commodities in the Age of Big Data

 -- By ValeriaVouterakou - 25 Oct 2024
In today’s digital world, people are no longer merely individuals; they have metamorphosed into products. Unbeknownst to them, their data is scooped up, traded and even weaponised by corporations, advertisers, insurance companies and surveillance networks. Surveillance capitalism has turned personal information into a goldmine, fuelling a largely unregulated market where data is extracted, processed and sold, often without ‘real’ consent. This paper looks at how personal data has become a high-value asset in today’s market, influencing areas such as targeted advertising, financial transactions and healthcare, all while operating in a legal sphere that remains opaque and largely underdeveloped. It explores the evolving debate on data ownership as a constitutional right and argues that the uncontrolled commodification of personal data is not merely a privacy issue but a fundamental power imbalance that threatens individual autonomy.
 

How surveillance became a business model

 
The big data market is experiencing unprecedented growth, with reports estimating that it will reach $401.2 billion by 2028 (n1). Personal data is now one of the world’s most powerful resources, with some arguing that it has become a more valuable commodity than oil. Tech giants such as Google and Meta have built business models around the extraction and sale of personal data to advertisers. Google controls nearly 90% of the global search engine market, which gives an indication of how valuable the collection of search histories, locations and purchasing patterns is for the digital ad market. Similarly, Meta, through the use of tracking pixels, follows its users’ interactions and browsing habits outside its own platform and uses that information to serve targeted ads across the web.
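
To make the mechanism concrete, the sketch below shows how a generic third-party tracking pixel might work. It is an illustrative assumption only: the domain, endpoint, field names and event names are hypothetical and are not Meta’s actual pixel API.

    // Illustrative sketch of a generic third-party tracking pixel (TypeScript).
    // The domain, path and field names are hypothetical, not any real ad platform's API.
    function firePixel(eventName: string, details: Record<string, string>): void {
      const params = new URLSearchParams({
        ev: eventName,                 // e.g. "PageView" or "Purchase"
        url: window.location.href,     // which page on the retailer's site was visited
        ...details,
      });
      // Requesting this tiny image sends the visit record to the ad platform.
      // The browser attaches the platform's cookie, so the platform can link this
      // off-platform activity to the same person's profile on its own services.
      new Image().src = "https://ads.example-platform.test/tr?" + params.toString();
    }

    // A retailer embedding the pixel might report a completed purchase like this:
    firePixel("Purchase", { value: "49.99", currency: "USD" });

Because the same cookie accompanies the request wherever the pixel is embedded, the platform can assemble a cross-site browsing history even when the user never visits its own pages.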
 
The Cambridge Analytica scandal (2018) exposed the misappropriation and unethical harvesting of Facebook (now Meta) data to create psychographic profiles of voters, allegedly swaying electoral outcomes including the 2016 U.S. presidential election and the Brexit referendum. The scandal highlighted the opaque landscape of data processing and Facebook’s failure to adequately protect user data. The Federal Trade Commission subsequently fined Facebook $5 billion and demanded changes to its privacy practices and corporate structure (n2).

The transformation of data processing into a business model also prompts the question of data ownership. The answer differs by jurisdiction: the EU’s General Data Protection Regulation (the “GDPR”) grants individuals (at least in principle) robust rights over their personal data, whereas the U.S. lacks a comprehensive federal data privacy law and relies on a patchwork of state-specific regulations (such as the California Consumer Privacy Act (the “CCPA”)) that offer less stringent protections. Overall, the GDPR adopts something closer to a ‘shared ownership’ perspective, recognising individuals’ rights over their data after it has been shared but not expressly conceding full and unconditional ownership. Notwithstanding the GDPR’s and the CCPA’s efforts to give individuals more control over their personal data, the underlying assumption of the status quo is that individuals are not inherently entitled to ownership of their data; the default position recognises partial corporate ownership. The surveillance capitalism narrative is closer still to the idea of company ownership: once users agree to a company’s terms, their data forms part of the company’s business assets and can be utilised at its discretion.
 

Should data ownership be a constitutional right?

 
In an era where personal data dictates access to financial opportunities, employment and health insurance, it is imperative to recognise data as an extension of the self, protected under constitutional principles. The Fourth Amendment establishes a fundamental right to privacy by safeguarding individuals from unreasonable searches and seizures. It is not unreasonable to suggest that the collection, processing and monetisation of personal data without the express consent of the individual poses a serious threat to Fourth Amendment values: it is surveillance without authorisation, akin to trespassing on someone’s private property without a search warrant. The Supreme Court recognised the sensitivity of these privacy interests in Carpenter v. United States (2018) (n3), where it ruled that the government’s access to cell phone location data without a warrant violated the Fourth Amendment. Applying the same principle, corporations that engage in surveillance capitalism and deploy the business models described above without meaningful consent from their users should be subject to a comparable standard of scrutiny.

If data is to be characterised as an extension of the self (i.e. as property), then it can also be argued that the Fifth Amendment Takings Clause, which prohibits the government from seizing private property without just compensation, creates the breeding ground for a paradigm shift. By recognising personal data as a property right, individuals would gain control over how their data is used and be entitled to compensation whenever it is harvested and sold. Much as intellectual property law protects creators from the unauthorised use of their work, data law could establish that individuals own the digital footprints they generate and must be compensated for them. Expanding on this principle, one can even argue that the time individuals spend generating digital footprints should be treated as labour and compensated when it becomes a commodity traded in the market from which large corporations profit. If digital data is treated as personal property or as the product of labour, then express consent and compensation become imperative. This follows from recognising that people passively generate economic value through their digital footprints, and that the harvesting and selling of that data by large corporations, if not prohibited altogether, should at least not be free.
 

The body as a data source

 
The rise of wearable tech products such as the Apple Watch, the Oura ring and Whoop has enabled the collection of a wide range of bodily data: heart rate monitoring, sleep cycle tracking and constant streams of biometric measurements. What most people fail to understand is that this information does not stay private. Even when it is processed in anonymised form, the data is still sold to insurers, pharmaceutical companies and marketing firms, all looking to profit from the most intimate details of the body. Some insurance companies have taken this practice a step further and turned it into a financial exchange: they offer discounts to policyholders who wear fitness trackers. Here the body (through the production of data on movement, steps and biometrics) is commodified in return for better financial terms in an insurance policy.
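
To illustrate what such a constant stream can contain, the sketch below shows the kind of record a fitness tracker might upload every few minutes. The field names are hypothetical and do not reflect any particular vendor’s schema.

    // Hypothetical example of a single biometric sample a wearable might upload (TypeScript).
    // Field names are illustrative only, not any vendor's actual data model.
    interface BiometricSample {
      deviceId: string;                                 // stable identifier linking every sample to one wearer
      timestamp: string;                                // e.g. "2025-02-18T02:43:00Z"
      heartRateBpm: number;                             // current heart rate reading
      sleepStage?: "awake" | "light" | "deep" | "rem";  // present during sleep tracking
      steps?: number;                                   // cumulative step count
      latitude?: number;                                // some devices attach location as well
      longitude?: number;
    }

Even without a name attached, a continuous stream of such records paints a remarkably intimate picture of one person’s daily life.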
 
The acquisition of Fitbit by Google was also a pivotal moment for the health data market, as it allowed Google, which already controls the vast majority of search and advertising, to gain access to an unprecedented amount of global health data. Aside from the antitrust concerns, what seems alarming about this acquisition, beyond targeted advertising, is that Google can now collect real-time data on heart rates, sleep patterns and exercise habits, controlling not only a massive volume of digital footprints but physical ones as well. The power this combination of data confers raises serious ethical concerns. The data can be leveraged and sold to health insurance providers, which in turn can adjust premiums based on activity levels. In a world where data brokers already profit from personal data, nothing will stop a company holding this much data power from integrating health data into its sprawling network of behavioural profiling and algorithmic prediction.

Conclusion

Surveillance capitalism has turned human existence into an endless data stream, packaged and sold in ways that most people can barely comprehend. Our internet searches, movements, heartbeats, sleep cycles and consumer activities are no longer personal experiences. Corporations have turned them into commodities, fuelling an industry worth hundreds of billions, and yet the people generating this immense value see no meaningful recognition of ownership, control or even profit. This is no longer a discussion about privacy; it is about power. When corporations understand our habits better than we do, when they can predict our choices, manipulate our emotions and determine our financial futures, what does autonomy then amount to? The grim image of a society where individuals’ data become commodities, traded in a market to which they never really agreed, is truly dystopian. If the law protects property, freedom of speech and bodily autonomy, why should it refuse to protect the right to one’s own digital existence? If we acknowledge that the Constitution is not a static document but a framework that evolves with society, then a reinterpretation through the lens of living constitutionalism can help us redefine personal autonomy and the right of a human being to their own data. Until real ownership is granted and people have meaningful control over their data, surveillance capitalism will continue to thrive, morphing humans into nothing more than numbers on a corporate ledger.

Endnotes (not included in word count):

n1 - https://www.marketsandmarkets.com/PressReleases/big-data.asp

n2 - https://www.ftc.gov/business-guidance/blog/2019/07/ftcs-5-billion-facebook-settlement-record-breaking-and-history-making

n3 - https://www.supremecourt.gov/opinions/17pdf/16-402_h315.pdf

 