AikenLarisaSerzoSecondEssay 3 - 17 Jan 2022 - Main.AikenLarisaSerzo
Beyond Consent-Based Privacy Regulations
-- By AikenLarisaSerzo - 16 Jan 2022
Introduction
Rather than encouraging ethical software and data processing, existing privacy regulations legitimize and empower the exploitation of data by private enterprises. The start of December brings with it a deluge of people sharing their Spotify Wrapped -- an end-of-the-year feature provided by Spotify that summarizes the listening habits of its users. It provides information -- in digestible 5-second clips -- on the top songs, artists and podcasts streamed, the number of minutes spent streaming songs, and comparative insights. Users are then given the option to share these insights on various social platforms. It is incredible how Spotify made everyone immediately and proudly share insights gathered about them, with laser-like precision, by a private company.
Spotify, like other tech companies, made the collection and processing of personal information acceptable, if not natural. Users voluntarily share data in exchange for insights and convenience.
The GDPR and regulations like it make the processing of personal information legal provided that consent is obtained. Countries that have enacted general privacy laws, like the Philippines, have largely adopted consent-based and accountability principles. The Philippines went further by making violations criminal. This incentivized tech companies to engage lawyers, conduct audits, and draft privacy policies. The result is lengthy consent forms written in legalese, which Zuboff aptly calls surveillance policies.
The implementation of audits and privacy policies renders the processing of personal information legal provided that the extent of such processing is captured in the consent forms. However, privacy regulations only appear to legitimize the comprehensive and unscrupulous scraping and processing of personal data.
In the Philippines, the strict privacy laws have not prevented companies from engaging in the wide-scale exploitation of data. During the height of the pandemic, Huawei provided local governments with AI technology that enables doctors to identify probable COVID cases through the patients’ CT scans. The Philippines essentially volunteered itself to be the testing ground (and an additional source of data sets) for Huawei’s algorithms. A basic version of Facebook may also be accessed by users for free. Such an act, disguised as philanthropy, allows Facebook to collect behavioral data about the population. The political polarization of the population, as described by Maria Ressa, and the prevalence of misinformation clouding the upcoming elections could be manifestations of the echo chambers enabled and empowered by Facebook.
Possible Solutions
Existing consent-based regulations will not be sufficient to protect the privacy rights of individuals. It is difficult to regulate behavior when there is no guarantee that the people the regulations are meant to protect possess the competencies and resources to understand the extent of the consent they are giving. Even the drafters of consent forms will not be able to completely capture the extent of their processing activities.
Other methods should be implemented to supplement existing regulations:
End user-facing online service providers should be prohibited by regulation, in the country where data subjects reside, from (i) conducting expansive and general processing of data, and (ii) bundling into their general consent forms provisions related to the processing of personal data for purposes beyond the services a user is primarily signing up for, or to the monetization of personal data through cross-selling or upselling products of their own or of third parties.
Service providers usually include blanket provisions that would allow them to use personal data “to improve services or products”. The use of personal data to improve services should be limited to addressing specific risks and harms to which a user is exposed, such as virus or spam risks, and to the provision of customer service.
In the interest of giving users autonomy over their personal data, users may still be allowed to permit third parties to access data collected by another party. Such sharing, however, should be actively initiated by the user, not the processor. More importantly, any functionality or prompt within a service that implements such a user-initiated command must be presented to the user in a manner that is clear and apparent, separate from the provider’s general consent form.
To ensure compliance with the foregoing limitations, and because it is impractical to predict how technology will evolve and how data will be processed, an independent government agency must be empowered to regularly evaluate online services’ data processing activities. The agency should identify what services a user obviously signed up for when onboarding with the provider. Processing activities that involve expansive and general processing, or that are not directly necessary to fulfill the provider’s apparent obligations to the user, should be prohibited. To illustrate, if the user signed up for music streaming services, data processing for the purpose of cross-selling logistics or delivery services, or the collection of data for credit scoring, should be prohibited. To mitigate corruption, the agency should be headed by a panel of unelected officials serving staggered terms that extend beyond the appointing authority’s.
Providers that refuse to comply with the agency’s directives should be blocked domestically, making it harder for them to gain users and revenue from the relevant jurisdiction. National governments may have some leverage to enforce this against offshore providers, especially those that earn revenue from local customers. Revenue earned by offshore platforms still needs to pass through the domestic banking system, so infractions may be pursued and pressure applied by cutting off the providers’ access to that system.
Conclusion
Consent-based regulations are not sufficient to enhance or protect privacy rights. These laws rest on unrealistic assumptions about data subjects and controllers. They assume that users have the ability and the resources to understand consent forms; the consent provided is far from an informed one.
If the objective is to protect the freedoms of individuals, the ability of service providers to process data must be curtailed more aggressively.
AikenLarisaSerzoSecondEssay 2 - 03 Jan 2022 - Main.EbenMoglen
In the Philippines, the strict privacy laws have not prevented companies from engaging in the wide-scale exploitation of data. During the height of the pandemic, Huawei provided local governments with AI technology that enables doctors to identify probable COVID cases through the patients’ CT scans. The Philippines essentially volunteered itself to be the testing ground (and an additional source of data sets) for Huawei’s algorithms. A basic version of Facebook may also be accessed by users for free. Such an act, disguised as philanthropy, allows Facebook to collect behavioral data about the population in a manner similar to what the tech giant did in India.
No, they didn't. India's Telecommunications Regulatory Authority—after the public campaign one of our contributions to which you cite—prohibited zero-rating practices.
The political polarization of the population, as described by Maria Ressa, and the prevalence of misinformation clouding the upcoming elections could be manifestations of the echo chambers enabled and empowered by Facebook.
Apparently you did not proofread.
I think the best route to improvement would be to focus on your own idea, the one you want the reader to come away with. There are many ideas here, most of which are familiar to me. Data protection law protects data trading, not people. Bilateral consent is not a sufficient basis for dealing with environmental hazards. Corrupt, venal governments behaving with or tolerating lawless violence will not protect human rights to privacy no matter what the law in the books says. Development of inexpensive powerful computers and free software provides individuals some technical means to protect their freedom. These are each adequately stated in a sentence or two, leaving plenty of room for the development of your own idea.
AikenLarisaSerzoSecondEssay 1 - 10 Dec 2021 - Main.AikenLarisaSerzo
Privacy Regulations: Beyond the Legal Exploitation of Data
-- By AikenLarisaSerzo - 10 Dec 2021
Introduction
Rather than encouraging ethical software and data processing, existing privacy regulations legitimize and empower the exploitation of data by private enterprises. The start of December brings with it a deluge of people sharing their Spotify Wrapped -- an end-of-the-year feature provided by Spotify that summarizes the listening habits of its users. It provides information -- in digestible 5-second clips -- on the top songs, artists and podcasts streamed, the number of minutes spent streaming songs, and comparative insights. Users are then given the option to share these insights on various social platforms. It is incredible how Spotify made everyone immediately and proudly share insights gathered about them, with laser-like precision, by a private company.
Spotify, like other tech companies, made the collection and processing of personal information acceptable, if not natural. Users voluntarily share data in exchange for insights and convenience.
The GDPR and regulations like it make the processing of personal information legal provided that consent is obtained. Countries that have enacted general privacy laws, like the Philippines, have largely adopted consent-based and accountability principles. The Philippines went further by making violations criminal. This incentivized tech companies to engage lawyers, conduct audits, and draft privacy policies. The result is lengthy consent forms written in legalese, which Zuboff aptly calls surveillance policies.
The implementation of audits and privacy policies renders the processing of personal information legal provided that the extent of such processing is captured in the consent forms. However, it is doubtful whether these actually enhance privacy rights. If anything, privacy regulations only appear to legitimize the comprehensive and unscrupulous scraping and processing of personal data.
In the Philippines, the strict privacy laws have not prevented companies from engaging in the wide-scale exploitation of data. During the height of the pandemic, Huawei provided local governments with AI technology that enables doctors to identify probable COVID cases through the patients’ CT scans. The Philippines essentially volunteered itself to be the testing ground (and an additional source of data sets) for Huawei’s algorithms. A basic version of Facebook may also be accessed by users for free. Such an act, disguised as philanthropy, allows Facebook to collect behavioral data about the population in a manner similar to what the tech giant did in India. The political polarization of the population, as described by Maria Ressa, and the prevalence of misinformation clouding the upcoming elections could be manifestations of the echo chambers enabled and empowered by Facebook.
Inadequacies of Clickwraps and Consent Forms
Consent-based regulations are not sufficient to enhance privacy rights. These laws rest on unrealistic assumptions about data subjects and controllers.
The regulations assume that users have the ability and the resources to understand consent forms. The general population does not understand what rights they are giving away. The consent provided is far from being an informed one.
The system created by the regulations requires users to read and understand consent forms that accompany the services they access and use. Given the resulting length and language of such forms, and the number of services accessed by a user, the regulations presuppose that users will be able to spend the necessary time and resources to understand the forms.
Even assuming that a user actually studies the consent forms, it is impossible for the user to negotiate the terms of a provider’s products or services. The only choice is to refrain from using the problematic product or service altogether.
Whether the user actually has a “choice” is another question. If a user wants a music streaming service, all the popular providers (Spotify, Apple Music, YouTube Music) most likely have the same comprehensive and disproportionate terms in their consent forms, allowing them to process and share the user’s personal information both to deliver the service and for much more. The choice among services, therefore, is illusory.
The regulations further assume that the entities controlling and processing data understand the breadth and depth of the processing activities they intend to carry out and that they can synthesize these in consent forms. Technology and business models develop rapidly. It is unrealistic to expect the consent forms, and the users’ informed consent, to be updated in lockstep.
Possible Solutions
Regulations alone will not be sufficient to protect the privacy rights of individuals. It is difficult to regulate behavior when there is no guarantee that the people the regulations are meant to protect possess the competencies and resources to understand the extent of the consent they are giving. Even the drafters of consent forms will not be able to completely capture the extent of their processing activities.
Other solutions should be implemented to supplement regulations. Overregulation may be detrimental, as policymakers may lack the necessary understanding of technology given the velocity at which it changes. Regulations should not be so narrow as to restrict innovation, but neither should they be so broad as to be a mere scrap of paper. Technology outpaces regulation. Government and other stakeholders should be given a space to test technologies and understand how they can affect privacy. The solution proposed in class -- that a government agency be created to perform this role -- is a step in the right direction. Such an agency should be staffed by competent and technical people, but at the same time it should be insulated from the influence of Big Tech. (This may be difficult in countries where tech talent is hogged by private enterprise due to the unappealing nature of government work.)
The foregoing initiative will only be sustainable if the services of Big Tech are replaced by software that is understandable. No software should be allowed to proliferate as a black box. To be democratic and to enable the protection of individual rights, software should be transparent and modifiable, and its control should not be limited to a few. Regulation, open-source software, and tech education should work together to encourage the use of technology that protects individual privacy.
The consis