RohanGeorge1FirstPaper 3 - 17 Jun 2018 - Main.RohanGeorge1
Regulating large-scale Behavioral Collection, ver 2.0 [amended]
-- By RohanGeorge1 - 16 June 2018
Introduction
Last term, I wrote an essay proposing short-term measures, use restrictions and privacy by design, to regulate widespread behavioral collection. The Cambridge Analytica revelations made me realize that these were only stop-gaps, papering over the underlying gap between current privacy law and the problem of behavioral collection. It is now an opportune moment to tackle that problem directly.
I will first outline why existing constitutional provisions cannot adequately constrain behavioral collection or protect freedom of thought. Then I will propose a more structural regulatory alternative: a consideration of what rights users should possess in the digital services economy.
Current Constitutional Limitations on Regulating Behavioral Collection
Neither the Fourth nor the First Amendment currently protects privacy sufficiently.
The Fourth Amendment's main protections are against unreasonable searches and seizures of places; in the 21st century, it is identities that are searched. One's digital footprint is not a place, yet it is often searched by police. Unfortunately, the Fourth Amendment itself, under any interpretive theory, does not seem to cover identities. Moreover, the surrender of one's reasonable expectation of privacy by sharing information with a third party happens whenever we consent to data collection by using mobile apps or services, because our identities are now stored with third parties instead of on our persons or in our houses, papers, or effects. Consent opens the gateway to extensive behavioral data gathering. The Fourth Amendment therefore affords us little privacy protection.
For the First Amendment, the issue is slightly different. The relationship between privacy and freedom of expression lies in the freedom to read, learn, and consume content without monitoring. While there are strong protections for anonymous political speech and freedom of association, these protections have not been explicitly extended to the freedom to read and to receive knowledge. The First Amendment has not protected the space between the book and the reader at all.
An Alternative: Users' Rights
One solution relies on the Ninth Amendment's concept of retained rights, which the people hold even though they are not articulated in the Constitution. Here, the specific category of retained rights is users' rights to digital services.
Any fundamental reconstruction of technology and privacy law must begin by considering what rights users of technology and digital services should have. That starting point is far more suitable than foisting lackluster privacy protections on individuals after first ensuring that corporations can reap large profits from the creation of a large digital economy.
In this regard, extending current data-subject access rights under the GDPR to a more general concept of "users' rights" will not be worthwhile, because the GDPR's own starting point was fostering a digital economy.
The transactional, individual-centric model of privacy regulation ought to be replaced by an alternative that recognizes privacy as social or ecological, in the sense that it produces externalities the transactional model does not account for.
Under a privacy-as-ecology framework, the most important user right is the right not to have one's behavior extensively monitored and monetized. Large internet platforms learn about our lives by collecting information about our online activity, including browsing on websites other than those the platform owns: "Facebook can still monitor what they are doing with software like its ubiquitous 'Like' and 'Share' buttons, and something called Facebook Pixel — invisible code that's dropped onto the other websites that allows that site and Facebook to track users' activity". Platforms monetize this information by selling highly targeted access to users. But why should someone else know about every single action we take online, and even profit from that knowledge? Considering that instant messaging and photo sharing are not ground-breaking technological discoveries and are relatively simple to provide, users ought to have the right to easy access to high-quality digital services without selling themselves to gain that access.
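To make the tracking mechanism concrete, here is a minimal sketch of how an embedded pixel of this kind can work. It is illustrative only: the endpoint, parameter names, and logic are hypothetical, not Facebook's actual code. The point is how little machinery a third party needs to observe reading habits across unrelated sites.

```typescript
// Hypothetical sketch of a third-party tracking pixel (not Facebook's
// actual code). A website embeds this snippet; each page view then
// triggers a request to the tracker's server. Because the browser
// attaches the tracker's cookie to that request, the tracker can link
// visits by the same user across every site that embeds the snippet.

const TRACKER_ENDPOINT = "https://tracker.example.com/pixel"; // invented URL

function reportPageView(): void {
  const params = new URLSearchParams({
    url: window.location.href,   // which page is being read
    ref: document.referrer,      // where the reader came from
    ts: Date.now().toString(),   // when the visit happened
  });
  // Requesting a 1x1 image is enough to deliver the data: no visible
  // change to the page, no user interaction, no consent prompt.
  const pixel = new Image(1, 1);
  pixel.src = `${TRACKER_ENDPOINT}?${params.toString()}`;
}

reportPageView();
```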
Second, users of digital services should have the right to customize their internet services and use them as they wish. Here, the Free Software Movement's freedom zero is a helpful precedent: the freedom to run a program as the user wishes should, in this context, refer to the user's right to control how their data is used. This means not merely consenting, but requiring that services and software be designed so that users can actually decide whether they want to see any advertising, decide which third parties may receive their information, and change these settings whenever they wish. In this way, the community of internet users can work to correct or modify existing software, removing malware and improving inferior features. Individuals should be able to create a dislike button or customize their own social media newsfeed, for example.
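As a thought experiment, such a design requirement could look something like the sketch below: the user's choices form a first-class object that the service must consult before any data leaves the user's control. All names and types here are invented for illustration; no existing platform exposes such an interface.

```typescript
// Hypothetical sketch of "freedom zero for data": user choices as a
// first-class object a service must consult. All names are invented.

interface UserDataRights {
  showAdvertising: boolean;          // user decides whether to see ads at all
  allowedThirdParties: Set<string>;  // only parties the user has named
  behavioralCollection: "none" | "on-device" | "full";
}

// Sharing is denied unless the user has explicitly allowed the party,
// and the user can revoke permission at any time by editing the set.
function mayShareWith(rights: UserDataRights, party: string): boolean {
  return rights.allowedThirdParties.has(party);
}

// Example: a user who refuses ads, third-party sharing, and collection.
const myRights: UserDataRights = {
  showAdvertising: false,
  allowedThirdParties: new Set(),
  behavioralCollection: "none",
};

console.log(mayShareWith(myRights, "advertiser.example")); // -> false
```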
Role of Regulators
Equally important is that users' rights be enforceable. In my view, any digital service provider must have an obligation to ensure that users' rights are complied with. Beyond steep pecuniary fines where providers breach this obligation, digital service providers must also be obliged to build users' rights into their services. In a sense, this is privacy by design 2.0: the protection designed into the system is respect for users' rights.
Additionally, digital service providers should be required to undertake a Privacy Impact Assessment (PIA), taking into account the nature of the service and how its provision might affect individuals' rights to secrecy, anonymity, and autonomy. Similar to an Environmental Impact Statement required by NEPA and to the Data Protection Impact Assessment required by Art. 35 of the GDPR, a PIA focused on users' rights, backed by strict penalties for false disclosures, could be an effective way of protecting and enforcing users' rights in digital services.
In conclusion, users' rights, built into services by obligation and policed through impact assessments and penalties, represent an alternative to the current mechanism for regulating digital services and platform companies.
Amended in light of your comments, Prof. Moglen. - Rohan George (16 June 2018)
RohanGeorge1FirstPaper 2 - 10 Jun 2018 - Main.EbenMoglen
Improvement here is a matter of focus. We can remove some
extraneous material and shorten the presentation of the problem:
"Cambridge Analytica and the resulting furore over the data-sharing
behavior of Facebook will now allow more daring thinking than
before," is the point of departure, and you can put it succinctly.
Free software is not directly relevant, as your text makes clear: in
the economy of services, a concept of "users' rights" is not a
modification of copyright law, but a form of economic regulation at
a deeper structural level. So the definition of those rights isn't
an "it strikes me" matter, but really the heart of the enterprise.
What rights should users possess in the digital services economy?
Defining the rights and explaining the mechanism of their
enforcement is the primary subject. You cannot get everything
explained in detail in 1,000 words, but you can present the core of
your thinking as a more or less finished product.
RohanGeorge1FirstPaper 1 - 21 Apr 2018 - Main.RohanGeorge1
Regulating large-scale Behavioral Collection, ver 2.0
-- By RohanGeorge1 - 21 Apr 2018
Introduction
Last term, I wrote an essay suggesting that more immediate solutions were needed to regulate behavioral collection. These short-term solutions, use restrictions and privacy by design, were proposed because I argued that "there is no political will in the 'West' to challenge the dominion of the platform companies in the near future".
The Cambridge Analytica revelations have forced me to reconsider the above argument. It is now an opportune moment to capitalize on the newfound currency of privacy issues and political will to consider regulation of the platform companies in order to remedy the problem of extensive behavioral collection.
While my previous piece described some aspects of current legal failures to protect individual freedom of thought, this essay will first focus on the particular challenges of using existing constitutional provisions to protect privacy and freedom of thought. Second, this essay will consider a structural proposal that addresses freedom of thought directly, by adopting the philosophy behind free software.
Current Constitutional Limitations on Regulating Behavioral Collection
Neither the Fourth nor the First Amendment currently protects privacy sufficiently.
The Fourth Amendment's main protections are against unreasonable searches and seizures of places; in the 21st century, it is identities that are searched. One's digital footprint, a collection of browsing patterns, pages visited, liked and shared, and content on one's device, is not a place; yet it is often searched by police. The text of the Fourth Amendment, interpreted using any number of theories from originalist to formalist to functionalist, does not seem to cover identities. Moreover, the surrender of one's reasonable expectation of privacy by sharing information with a third party happens whenever we consent to data collection by using mobile apps or services, because our identities are now stored with third parties instead of on our persons or in our houses, papers, or effects. Consent opens the gateway to extensive behavioral data gathering. The Fourth Amendment therefore currently affords us little privacy protection.
For the First Amendment, the issue is slightly different. It is important to locate the relationship between privacy and freedom of expression in the idea of freedom to read, learn, and consume content without monitoring. While there are strong protections for anonymous political speech and freedom of association, these protections have not been explicitly extended to the freedom to read and to receive knowledge. In other words, the freedom to read without extensive monitoring has already been threatened by large-scale behavioral data collection, and the First Amendment has not protected the space between the book and the reader at all.
We Need Different Services or Different Service Providers
Different Services
In exploring further the idea of different services, I finally saw the conceptual link between free software and behavioral tracking. In particular, I realized the relationship between whether software is free or proprietary and the extent of behavioral data collection designed into the service.
When software is proprietary, distributed as Service as a Software Substitute, or simply part of a networked service that forces users to use non-free software, it is controlled by profit-maximizing companies. In an age where targeted digital advertising is a massive revenue-generating machine, proprietary software is often designed to monitor or track user activity, because doing so allows these companies to sell users as the product to advertisers. There is thus a clear relationship between non-free software and the incentive to design software that facilitates behavioral data collection for advertising.
By contrast, the essence of free software, it seems to me, is to ensure that users of software or internet services can control the software they use. In addition to the important idea that free software lets individuals customize their computing, there is the added import that with free software the profit motive does not control the design of the program or service, because the source code is released (instead of being copyrighted and sold).
This difference in incentives leads to free software programs without the malware that may monitor, track, or otherwise restrict users. Hence, a dramatic improvement in individuals' privacy is likely if users switch to free software that provides the same services without a middleman profiting from the provision of computing power, which is increasingly cheap and easy to provide in the 21st century. Free software will allow for reading anonymously and freedom from behavioral advertising, neither of which exists in the world of Facebook or Google.
However, the question then becomes how to mandate such adoption of free software. Any direct regulatory attempt to mandate free software service providers will be met with the full deep-pocketed resistance of the world's most valuable companies. But this does not mean that applying the philosophy of free software is foreclosed to regulators.
Different Services from the Same Providers - Drawing Users' Rights from the 'Free Software Freedoms'
In fact, regulators should consider a switch from data protection laws premised on consent to laws that enumerate what rights users should have. It strikes me that some of the freedoms that ensure users' control of software can be applied to a law enumerating users' rights with respect to internet services. For example, the freedom to run programs as the users wish (freedom 0) should, in this context, refer to the user's right to control how their data is used. This means not merely consenting, but requiring that services and software be designed so that users can actually decide whether they want to see any advertising, decide which third parties may receive their information, and change these settings whenever they wish. It could also include a right not to be discriminated against by algorithms, or simply a right not to have one's data collected at all. This is radically more control than the privacy controls Mark Zuckerberg suggested all users have.