RohanGeorge1FirstPaper 2 - 10 Jun 2018 - Main.EbenMoglen
Improvement here is a matter of focus. We can remove some extraneous material and shorten the presentation of the problem: "Cambridge Analytica and the resulting furore over the data-sharing behavior of Facebook will now allow more daring thinking than before," is the point of departure, and you can put it succinctly. Free software is not directly relevant, as your text makes clear: in the economy of services, a concept of "users' rights" is not a modification of copyright law, but a form of economic regulation at a deeper structural level. So the definition of those rights isn't an "it strikes me" matter, but really the heart of the enterprise. What rights should users possess in the digital services economy? Defining the rights and explaining the mechanism of their enforcement is the primary subject. You cannot get everything explained in detail in 1,000 words, but you can present the core of your thinking as a more or less finished product.
RohanGeorge1FirstPaper 1 - 21 Apr 2018 - Main.RohanGeorge1
Regulating large-scale Behavioral Collection, ver 2.0
-- By RohanGeorge1 - 21 Apr 2018
Introduction
Last term, I wrote an essay suggesting that more immediate solutions were needed to regulate behavioral collection. I proposed these short-term measures, use restrictions and privacy by design, because I argued that “there is no political will in the ‘West’ to challenge the dominion of the platform companies in the near future”.
The Cambridge Analytica revelations have forced me to reconsider that argument. This is now an opportune moment to capitalize on the newfound currency of privacy issues, and the political will that has come with it, to consider regulating the platform companies in order to remedy the problem of extensive behavioral collection.
While my previous piece described some aspects of the law's current failure to protect individual freedom of thought, this essay first examines the particular challenges of using existing constitutional provisions to protect privacy and freedom of thought. It then considers a structural proposal that addresses freedom of thought directly, by adopting the philosophy behind free software.
Current Constitutional Limitations on Regulating Behavioral Collection
Neither the Fourth Amendment nor the First Amendment currently protects privacy sufficiently.
The Fourth Amendment's main protections are against unreasonable searches and seizures of places; in the 21st century, it is identities that are searched. One's digital footprint, a collection of browsing patterns, pages visited, liked and shared, and content on one's devices, is not a place, yet it is often searched by police. The text of the Fourth Amendment, interpreted under any number of theories from originalist to formalist to functionalist, does not seem to cover identities. Moreover, the surrender of one's reasonable expectation of privacy by sharing information with a third party happens whenever we consent to data collection by using mobile apps or services, because our identities are now stored with third parties instead of on our persons or in our houses, papers or effects. Consent opens the gateway to extensive behavioral data gathering. The Fourth Amendment therefore currently affords us little privacy protection.
The First Amendment presents a slightly different issue. The relationship between privacy and freedom of expression lies in the freedom to read, learn and consume content without monitoring. While there are strong protections for anonymous political speech and freedom of association, these protections have not been explicitly extended to the freedom to read and to receive knowledge. In other words, the freedom to read without extensive monitoring has already been threatened by large-scale behavioral data collection, and the First Amendment has not protected the space between the book and the reader at all.
We Need Different Services or Different Service Providers
Different Services
In exploring further the idea of different services, I finally saw the conceptual link between free software and behavioral tracking. In particular, I realized the relationship between whether software is free or proprietary and the extent of behavioral data collection designed into the service.
When software is proprietary, distributed as Service as a Software Substitute, or simply offered as a networked service that forces users to run non-free software, the software is controlled by profit-maximizing companies. In an age where targeted digital advertising is a massive revenue-generating machine, proprietary software is often designed to monitor or track user activity, because doing so allows these companies to sell users as the product to advertisers. There is thus a clear relationship between non-free software and the incentive to design software that facilitates behavioral data collection for advertising.
By contrast, the essence of free software, it seems to me, is to ensure that users of software or internet services can control the software they use. Beyond the important fact that free software lets individuals customize their computing, no profit motive controls the design of the program or service, because the source code is released under a free license rather than withheld and sold.
This difference in incentives yields free software programs without the malware that monitors, tracks or otherwise restricts users. Hence, a dramatic improvement in individuals' privacy is likely if users switch to free software that provides them the same services without a middleman profiting from the provision of computing power, which is increasingly cheap and easy to supply in the 21st century. Free software would allow the anonymous reading and the freedom from behavioral advertising that do not exist in the world of Facebook or Google.
The question then becomes how to mandate such adoption of free software. Any direct regulatory attempt to require free software service providers will be met with the full deep-pocketed resistance of the world's most valuable companies. But this does not mean that applying the philosophy of free software is foreclosed to regulators.
Different Services from the Same Providers - Drawing Users' Rights from the 'Free Software Freedoms'
In fact, regulators should consider a switch from data protection laws premised on consent to laws that enumerate the rights users should have. It strikes me that some of the freedoms that ensure users control their software can be applied in a law enumerating users' rights with respect to internet services. For example, the freedom to run programs as the user wishes (freedom 0) should, in this context, become the user's right to control how their data is used. This means not merely consenting, but requiring services and software to be designed so that users can actually decide whether they want to see any advertising, decide which kinds of third parties they would like to share their information with, and change these settings whenever they wish. It could also include a right not to be discriminated against by algorithms, or simply a right not to have one's data collected at all. This is radically more control than the privacy controls Mark Zuckerberg suggested all users already have.
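To make the design requirement above more concrete, here is a minimal, purely illustrative sketch in Python. It is not drawn from any existing law or platform API; every class, field and function name is hypothetical. It shows one way a service could represent per-user rights settings of the kind described, with the user able to change them at any time, and refuse any processing the user has not permitted.

# Hypothetical sketch: per-user rights settings a regulator might require
# services to honor. Names and structure are illustrative only.
from __future__ import annotations
from dataclasses import dataclass, field


@dataclass
class UserRightsSettings:
    """Settings the user, not the provider, controls and may change at any time."""
    allow_collection: bool = False             # right not to have one's data collected at all
    allow_advertising: bool = False            # whether the user sees any advertising
    allow_algorithmic_profiling: bool = False  # guard against algorithmic discrimination
    permitted_third_parties: set[str] = field(default_factory=set)  # kinds of recipients the user accepts


class ProcessingNotPermitted(Exception):
    """Raised when the service attempts something the user has not permitted."""


def check_processing(settings: UserRightsSettings, purpose: str,
                     recipient_kind: str | None = None) -> None:
    """Refuse any processing the user's current settings do not allow."""
    if not settings.allow_collection:
        raise ProcessingNotPermitted("user has opted out of data collection entirely")
    if purpose == "advertising" and not settings.allow_advertising:
        raise ProcessingNotPermitted("user has declined advertising")
    if purpose == "profiling" and not settings.allow_algorithmic_profiling:
        raise ProcessingNotPermitted("user has declined algorithmic profiling")
    if recipient_kind is not None and recipient_kind not in settings.permitted_third_parties:
        raise ProcessingNotPermitted(f"user has not permitted sharing with '{recipient_kind}'")


if __name__ == "__main__":
    # The user, exercising the right to change settings at will, permits
    # collection and sharing with payment processors, but no advertising.
    settings = UserRightsSettings(allow_collection=True,
                                  permitted_third_parties={"payment_processor"})
    check_processing(settings, purpose="service_delivery",
                     recipient_kind="payment_processor")  # allowed
    try:
        check_processing(settings, purpose="advertising")
    except ProcessingNotPermitted as refusal:
        print(f"Refused: {refusal}")

The point of the sketch is structural rather than technical: the refusal logic lives inside the service and is driven entirely by settings the user can change at any moment, not by a one-time consent screen.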