Law in the Internet Society

TWikiGuestFirstEssay 32 - 13 Oct 2023 - Main.LudovicoColetti
In the TV show "Black Mirror," the episode "Nosedive" depicts a dystopian world where social media ratings determine one's socioeconomic status and access to essential services. Using a mobile application, everyone constantly rates everyone else on a five-point scale. Those with higher scores can access better services and exclusive clubs, while those with low scores are penalized in many ways. While this may seem like far-fetched fiction, today's reality may not be too distant from this portrayal.
 
The first example that comes to mind is China's Social Credit System (SCS), developed between 2014 and 2020. The SCS uses artificial intelligence "to develop comprehensive data-driven structures for management around algorithms that can produce real time reward-punishment structures for social-legal-economic and other behaviors" (Larry Cata Backer, Next Generation Law: Data-Driven Governance and Accountability-Based Regulatory Systems in the West, and Social Credit Regimes in China, 2018). In reality, the SCS does not rely on a single universal score but rather on a series of blacklists and redlists managed at different levels (municipal, local, or national). Each authority can manage its own blacklist (for example, of those who have failed to pay fines or child support), and they all converge into the National Credit Information Sharing Platform. As Kevin Werbach notes in his 2022 article "Orwell That Ends Well? Social Credit as Regulation for the Algorithmic Age," this makes it possible that "grade A taxpayers receive customs fee waivers and low-interest loans, in addition to the 'home' benefits offered by the tax collection authority." Prof. Werbach, however, believes that Western depictions of the SCS are exaggeratedly negative, especially in a world where governments and corporations already track our behavior extensively. He sees the Nosedive scenario as more closely resembling the rating systems of Uber or eBay, expanded beyond the boundaries of a single service.
 
As Yuval Noah Harari has noted, free-market capitalism and state-controlled communism can be regarded as distinct data-processing systems: the former is decentralized and the latter is centralized. It should not come as a surprise, then, that Western experiments with social credit are being conducted mainly by private corporations, especially in the financial sector.

TWikiGuestFirstEssay 31 - 10 Jan 2022 - Main.NataliaNegret
We post on our virtual canvas thinking that we purposely and consciously decide the content of those posts, but our mental capacity does not drive those actions. The parasite induces them. The parasite gives us tools to share, to edit pictures, and to post, but the underlying reality is that we post the only thing we own: our time.
 
Nonetheless, our time is relative. We are drained of our freedom of decision every time we click, swipe, and accept terms and conditions from the "free" services we use on the internet. I want to emphasize the quotation marks I placed around "free." I use them because we tend to think that we decide what we post and with whom we share and interact. The reality is different. We lie to ourselves when we say that we control our virtual personalities.
 
The parasite is the only one controlling who we are. We feed the parasite with our posts, even when we overthink their content. We give the parasite control over our capacity to decide. The parasite knows our steps, knows our fertility cycle (it even predicts it), knows our sleep cycle, and suggests what to eat, buy, and like. All these "suggestions" are inductions.
 
I have realized that even when we reflect on our virtual accounts, controlling and limiting our virtual content is not enough. In other words, we waste our time trying to curate the life we want to share. We are not curating or deciding. In the end, it is the parasite that grows. It is sinking asphyxiating roots into our brains.
 
We are not curating for those who benefit from our engagement (social media platforms, stores, advertisers) or for those who follow us and want a glimpse of our life. The paradox here is that every time we feed our virtual profiles, we deprive ourselves of the ability to keep things private. And with this, once again, we make the parasite bigger and stronger.
 
I ask myself: What is the purpose of keeping things private? My answer is that privacy buys me time to reflect, think, and create. Privacy protects how a piece of information about you has been obtained. In Marmor's words, "it is about the how, not the what, that is known about you." This matters because "our ability to control [how] we present ourselves to others is inherently limited."
 
From that stance, privacy gives us time, and therefore protects our control over when to disclose or reveal something. The underlying issue with the parasite is that it has become the curator of our profiles, and in that filtering process, which we ourselves set in motion, we have lost the ability to choose how others use our time (and our lives).
 
Feeding the profiles consumes time. We post because the likes, comments, and virtual interactions affirm or reinforce the virtual being we choose to share with our selected community. We believe we have control over what we share and with whom. Still, the reality is that every click diminishes freedom, extinguishes privacy, and deprives us of the only thing we own: time.
 
The outcome is our inability to reflect and pause, because our consciousness of time is limited by immediacy, neediness, and over-exposure. And the worst part is that the idea of being infinite humans, in the microcosmic sense, vanishes under constant self-reinvention instead of self-expansion.
 
The parasite is playing the game of "letting us choose." What we have to realize is that every choice makes the parasite stronger. The parasite is using us to increase its profits, triggering our decisions in whatever way serves its ends. The verbs "to share" and "to post," which are at the core of interactions on these platforms, have withered integration, insertion, and social construction. We handed over our privacy in exchange for a fake sense of control. We gave up our time, our memories, and our idea of integrity in exchange for a supposedly self-made identity that lacks authenticity and freedom: a view imposed on us by the parasite.
 
Humanity has dealt with the question of eternity and infinity since we began articulating ideas. To overcome the fact that our nature is limited by time, people used to write, paint, have children, and teach. By shifting the idea of "eternity" onto platforms that hold and "save" our memories, we have let our approach to eternity rot. To illustrate this, I want to recall the moment when Don Quixote found out, in his conversation with Sansón Carrasco, that his adventures were a topic of discussion among the students at the University of Salamanca. For him, being public, discussed, and remembered was an outcome, not a decisive purpose. He did not act in order to be a topic. Through the course of his actions, he became a character and, as a result, a subject of discussion. The adventures of the lions, the windmills, and the galley slaves were public, and some read those actions as insanity, others as genius.
 
Our desire for control shows that our aim to be remembered is vague, because we rely solely on feeding the parasite. If we aim to change this reality, we need to cut ties with the parasite: disable our social media accounts and the tracking that apps keep over our lives. We need to stop posting on platforms that instrumentalize our time to deprive us of individuality.

TWikiGuestFirstEssay 30 - 01 Nov 2021 - Main.RochishaTogare
A Growing Need to Protect Privacy in an Era of Growing Willingness to Give it Up

 
The Advent of Privacy Challenges

 
Those of us born in the 90s remember the in-between: the shift from people carrying cellphones to people carrying cellphones that could connect to the internet; from using a bulky computer in one stationary place to carrying around a laptop that let us take our work anywhere; from "social" meaning face-to-face meetings to "social" being the word that finds its place before "media."
 
We look at our current debates over privacy and think, "this is because of the internet revolution." But in fact, the right to privacy has been alluded to since the very advent of our nation. The U.S. Constitution, as interpreted by the Supreme Court, recognizes a right to privacy in multiple amendments. Further, one of the first articles addressing privacy was written by Samuel Warren and Louis Brandeis (later a Supreme Court Justice) in the 1890 Harvard Law Review, prompted by the advent of photography and newspaper intrusion into individuals' homes. In 1948 the U.N.'s Universal Declaration of Human Rights addressed privacy, and soon after, in 1960, legal scholar William Prosser "outlined four torts that would allow someone whose privacy was violated…to sue the perpetrator for damages." (1)
 
The Modern Issues

 
In the past, such concerns were largely driven by individuals not having control over the actions of others: the press taking photos, the government invading their homes. In today's age, however, the concern is individuals' own ignorance of, or willingness to forgo, privacy in exchange for services. In an era of programmatic, targeted advertising, it is easy to give up our names, ages, emails, and phone numbers for the convenience and range of services that make life easier, often with the added allure of those services being free.

Earlier this month, former Facebook employee Frances Haugen released files revealing the results of the company's internal research on the impact of Instagram on teenage girls. A key statistic highlighted in the media is that "32 percent of teen girls said that when they felt bad about their bodies, Instagram made them feel worse" (2). One response notes that children under thirteen are not even supposed to be creating accounts, because data collection on children under that age violates our country's privacy laws. Yet I know many of my classmates signed up for Facebook before they were thirteen using fake birthdays. Facebook has also floated the possibility of creating "Instagram Kids."

Similarly, people invariably offer up their data. Sometimes they do so simply because they are unaware of what they are revealing (as with the military base that was exposed when soldiers decided to compete with each other, uploading their fitness-tracker data in the process and creating a map of their exercise routes). At other times, we do so for convenience, as with the FreeStyle Libre sensors that have been using AI to recommend personalized diets based on individuals' glucose levels (4).

Attempts at Solving The Issue

Apple created a lot of buzz (and some very creative advertising campaigns) when it released a pop-up window that notifies users that an app wants to track their data, allowing users to prevent the app from doing so (3). Many small businesses and apps were upset by the change, arguing that this tracking was how they were able to offer users their services for free. Facebook responded that it was attempting to create a method of advertising that does not rely on user data (3). But is it really that easy to dismantle a $350 billion digital industry? These companies hold different views on how far such advertising should be rolled back.

While Big Tech attempts to revamp its own privacy systems, can and should users do more to take privacy into their own hands? I am positive that many people would rather use an app for free than pay to remove advertising (as evidenced by the numerous app-store complaints when apps roll out pay-for-no-ads versions of their products). There is a growing industry of products that market themselves as shunning ads (for example, Brave, the private web browser), but how many people choose to use such services?

Furthermore, what is the state of media literacy in our country? One of the first ways we can protect young children, who will inevitably sign up for these enticing social media services, is to inform them about what they give up in exchange for access to endless streams of videos, 150-word posts, and their friends' photos.

In the long run, I would argue that this education is a must if we are to convince people to pay subscription fees instead of paying for such services with their data.

(1) https://safecomputing.umich.edu/privacy/history-of-privacy-timeline

(2) https://www.nytimes.com/2021/10/13/parenting/instagram-teen-girls-body-image.html

(3) https://www.nytimes.com/2021/09/16/technology/digital-privacy.html

(4) https://www.theguardian.com/lifeandstyle/2021/oct/05/intimate-data-can-a-person-who-tracks-their-steps-sleep-and-food-ever-truly-be-free


TWikiGuestFirstEssay 29 - 26 Oct 2021 - Main.KatharinaRogosch
A Growing Concern For Privacy
 
In the modern world of technology, where internet mammoths such as Google and Facebook collect large amounts of personal data, the regulation of such collection is essential. The interconnected relationship between data and individuals' privacy over their own data needs to be examined to understand whether the current framework can achieve its own aims. This requires a two-step analysis: first, an examination of the regulation of data privacy and whether the standards imposed actually result in said protection; and second, an evaluation of whether privacy should be protected by other means than it currently is.
 
To aid in this analysis, the European General Data Protection Regulation (hereinafter "GDPR") will be examined, because it is one of the strictest data protection laws enacted worldwide, and an examination of such a strict privacy and data-protection standard should provide clarity as to whether adequate privacy protections have been achieved.

General Data Protection Regulation:

 
Within the European Union, data protection is secured and regulated by the General Data Protection Regulation. The GDPR aims "to give citizens and residents of the European Union and the European Economic Area control over their personal data, and to simplify the regulatory environment for international business by fully harmonizing the national data laws of its member states." However, the GDPR does not only concern privacy; rather, its objectives relate to the "fundamental rights and freedoms of natural persons" surrounding the "processing and free movement of personal data." Consequently, the GDPR also aims to address the rising power of Big Data practices and the "economic imbalance between [these companies] on one hand and consumers on the other."
 
The GDPR addresses the power imbalance between data controllers, who derive significant commercial benefit from the use of data, and users, who bear significant harms associated with the use of their own data. The legislation does this by placing explicit consent and anonymization techniques at the core of data processing. However, by focusing on these two specific aspects, the European legislators construct "structural regulatory spaces that fail to understand the ongoing challenges in delivering acceptable and effective regulation." By concentrating exclusively on consent and anonymization techniques as the way to ensure data privacy and security, the GDPR fails to address not only the issues these concepts create but also how app developers should implement them.

There are two issues created by the GDPR that significantly affect individual users' privacy and data. First, by using individuals' consent as the gatekeeper to the lawful processing of data, the GDPR places heavy emphasis on internet platforms themselves to fulfill the necessary GDPR standards. While simply obtaining users' consent to the processing of their personal data does not by itself make the processing lawful, the fact that it is up to internet organizations themselves to implement adequate privacy standards says very little about the protection such standards afford in reality. Second, the GDPR stipulates that when data is anonymized, explicit consent to the processing of the collected data is no longer required. At its core, by placing emphasis on anonymization techniques, the GDPR aims to reduce harmful forms of identification by preventing the singling out of natural persons and their personal information. However, as Narayanan and Shmatikov's paper on the de-anonymization of large datasets and Oberhaus's article on anonymous browsing data underline, de-anonymization of large data sets is standard industry practice for a majority of internet platforms.
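
To make the de-anonymization point concrete, here is a minimal, hypothetical sketch in Python (the names and data are invented for illustration) of the simplest re-identification technique, a linkage attack. The papers cited above describe far more powerful statistical methods, but the underlying failure is the same: records stripped of names can still be matched against identified data through shared quasi-identifiers.

  # Hypothetical toy data: an "anonymized" release with names removed but
  # quasi-identifiers (ZIP code, birth year, sex) left intact.
  anonymized = [
      {"zip": "10027", "birth_year": 1994, "sex": "F", "sensitive": "condition A"},
      {"zip": "10027", "birth_year": 1987, "sex": "M", "sensitive": "condition B"},
  ]

  # A separate, identified dataset (e.g., a public profile list) sharing the same quasi-identifiers.
  public = [
      {"name": "Alice", "zip": "10027", "birth_year": 1994, "sex": "F"},
      {"name": "Bob", "zip": "10027", "birth_year": 1987, "sex": "M"},
  ]

  def reidentify(anonymized_records, public_records):
      # Join the two datasets on the shared quasi-identifiers.
      index = {(p["zip"], p["birth_year"], p["sex"]): p["name"] for p in public_records}
      for record in anonymized_records:
          key = (record["zip"], record["birth_year"], record["sex"])
          if key in index:
              yield index[key], record["sensitive"]

  for name, sensitive in reidentify(anonymized, public):
      print(name, "->", sensitive)  # Alice -> condition A, Bob -> condition B

The point of the sketch is only that removing names does not remove identifiability; the consent-or-anonymize structure of the GDPR treats it as if it did.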

Is the GDPR the right standard for privacy protection?

As outlined above, there are several issues with using the GDPR as the standard for privacy protection, the two biggest being the treatment of consent as the standard for privacy and the ability to de-anonymize data. Despite these issues, there are a number of benefits associated with using the GDPR as the standard for data protection, namely that it operates in what Professor Moglen, in his "The Union, May it Be Preserved" speech, described as a transactional sphere. While Professor Moglen sees this as a problematic quality of the GDPR, the fact that the GDPR treats data as a transaction, in which users consent to the collection and use of their data in exchange for the benefit of accessing internet platforms, means that the regulation can easily be implemented by any type of corporation. The issue with the GDPR is that its standards of implementation are too lax, and when the GDPR came into force in 2018, the impact of de-anonymization technologies was not sufficiently considered. One could argue that if amendments tackling de-anonymization technologies were introduced into the GDPR, the current privacy issues would be adequately addressed. However, such an argument fails to address the fundamental power imbalance created by internet platforms such as Google, Yahoo, and Facebook, where individual users are not given a choice as to how their data is processed.

Instead of working within the confines of the GDPR as it currently exists, Professor Moglen argues that we need to challenge our basic assumption that privacy and our data are part of the "transaction." To some extent this idea has merit: why should our own personal data be a transactional token by which our privacy is secured? In this sense, Professor Moglen's conception of privacy as "ecological" and "relational among people," rather than as an issue of individual consent, seems to provide a stricter standard of privacy protection. Yet while an ecological conception of privacy could provide a much stricter standard of data protection for individuals, the means of achieving such protection are less concrete. Namely, if an ecological protection of privacy is adopted, akin to environmental protection, what standard of privacy is going to be the baseline against which all protection is measured?

 



TWikiGuestFirstEssay 28 - 25 Oct 2021 - Main.RochishaTogare
 
Does the GDPR adequately protect individuals' privacy?


