|
META TOPICPARENT | name="FirstEssay" |
|
| My worries about having no say in the use of my personal data.
|
|
< < |
If we were to ask people from our generation
In the minority of the human race that is wealthy....
if they own a smartphone, use the internet on a daily basis or have social media accounts, their answer would be yes without any hesitation.
But what they and I fail to realise when using those technologies is that we share our personal information, intentionally or unintentionally, with private and public entities worldwide, which tend to violate our right to privacy by selling that information, without our consent, to companies we have never even heard of. |
| |
|
< < | Any information related to us, in our private or professional life, that we share online, constitutes our personal data.
Not necessarily. The information might not have anything to do with you at all. Once again, you might want to distinguish between the information distributed and the fact that you are either disseminating or receiving it. And you might be more attentive to the receiving activity rather than the transmitting activity, precisely because if you are, it will destabilize the structure of the current draft's argument.
|
> > | If we were to ask quite wealthy people from our generation if they own a smartphone, use the internet on a daily basis or have social media accounts, their answer would be yes without any hesitation.
But what they and I fail to realise when using those technologies is that we share our personal information, intentionally or unintentionally, with private and public entities worldwide, which tend to violate our right to privacy by selling that information, without our consent, to companies we have never even heard of. Personal data are any apparently anonymous data that can be cross-referenced to identify a specific individual. |
|
Those data are precious to companies because they reveal our preferences and our likes, which can be sold to other companies to increase their sales. |
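The claim that "anonymous" data can be cross-referenced to identify a specific individual can be made concrete with a small sketch. The following is purely illustrative: the datasets, names, and the `reidentify` helper are all invented for this example, but the join-on-quasi-identifiers technique (a linkage attack) is the standard way re-identification happens in practice.

```python
# Illustrative linkage attack: joining an "anonymized" dataset with a
# public one on shared quasi-identifiers re-identifies individuals.
# All records and names below are invented for the example.

# "Anonymized" dataset: names removed, but zip, birth year, gender remain.
anonymized = [
    {"zip": "10001", "birth_year": 1990, "gender": "F", "diagnosis": "asthma"},
    {"zip": "10002", "birth_year": 1985, "gender": "M", "diagnosis": "diabetes"},
]

# Public auxiliary dataset (e.g. a voter roll) with the same attributes
# alongside real names.
public = [
    {"name": "Alice Example", "zip": "10001", "birth_year": 1990, "gender": "F"},
    {"name": "Bob Example", "zip": "10002", "birth_year": 1985, "gender": "M"},
]

def reidentify(anon_rows, public_rows):
    """Join the two datasets on the shared quasi-identifiers."""
    matches = []
    for a in anon_rows:
        for p in public_rows:
            if (a["zip"], a["birth_year"], a["gender"]) == (
                p["zip"], p["birth_year"], p["gender"]
            ):
                matches.append({"name": p["name"], "diagnosis": a["diagnosis"]})
    return matches

# Each unique (zip, birth_year, gender) combination links a name
# back to a supposedly anonymous record.
print(reidentify(anonymized, public))
```

The point of the sketch is that no single dataset needs to contain a name for the data to be personal: it is the combination of datasets that identifies.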
| It seemed a good way to act after the fact and correct our past mistakes. But can we really rely on it? It did not take long before we could see the limits of this regulation.
In fact, the ECJ issued a ruling on September 25, 2019, regarding the implementation of the EU's right to be forgotten. |
|
< < | This dispute involved Google and France's data authority (CNIL). The ruling held that CNIL can compel Google to remove links to offending material for users that are in Europe, but can't do that worldwide. We should not be shocked with the ECJ ruling, it was foreseeable since a different ruling could have been viewed as an attempt by Europe to police an US tech giant beyond US borders. |
> > | This dispute involved Google and France's data authority (CNIL). The ruling held that CNIL can compel Google to remove links to offending material for users who are in Europe, but cannot do so worldwide. We should not be shocked by the ECJ ruling, which denies all extraterritorial effect to any EU measure seeking to implement the duty to forget coercively. It was foreseeable, since a different ruling could have been viewed as an attempt by Europe to police a US tech giant inside US borders. |
| |
|
< < |
No. It upholds the right to police the US company beyond US borders. Perhaps you meant inside US borders. That is, to displace US law in the US. But the point is to deny all extraterritorial effects to EU orders coercively implementing the duty to forget or censoring indexes through deindexing orders. As I emphasized in class, SFLC.in, the Indian sister organization to my own Software Freedom Law Center, intervened in the ECJ against CNIL, from a purely non-US perspective.
|
|
When we read through the decision, we see another underlying problem: how can the right to privacy and freedom of information work together? France asked the Court to extend the right to be forgotten universally to people outside the EU. Google argued that such a ruling might result in global censorship and infringement of freedom-of-information rights.
The real issue at stake is: can we impose on others a duty to forget? |
| |
|
< < | What about freedom of information. What else can we do? |
> > | What about freedom of information in the context of censorship? |
| |
|
< < | As said before, one good aspect of GDPR is that it creates a fast process to erase the data that Internet companies collect and store for use in profiling and targeted advertising. But on the other side, it can be used to erase online content, whether or not that content actually violates anyone else's rights. Such a use could constitute a violation of freedom of information, which is part of the fundamental right of freedom of expression. Those rights are recognised in international law, as in Article 19 of the Universal Declaration of Human Rights. It is true that there are some very important concerns about data protection and privacy in the face of the mass collection of our data by companies, but the way the "right to be forgotten" is built is not appropriate. |
| |
|
< < | First of all, freedom of expression in the public interest is essential, even for information obtained unlawfully. In fact, information may not have been accessible otherwise because it was kept secret, but it has been revealed for the good of society. Allowing a right to forget for those data would be harmful to society.
Another argument is that allowing people to have links related to their name deleted could become a way to hide the truth and give a false picture of who they are. Imposing on people who are seeking information a duty to forget someone else's information is not the right answer. Individuals have a right to access all the information available, and past mistakes should not be forgotten, but used as examples.
Maybe one answer to the problem would be to allow a right of correction or reply, which restricts freedom of expression less than the right to be forgotten does. It would enable individuals to present themselves as they really are and correct false information about them. |
> > | GDPR can be used to erase online content, whether or not that content actually violates anyone else's rights. Such a use could constitute a violation of freedom of information, which is part of the fundamental right of freedom of expression. Those rights are recognised in international law, as in Article 19 of the Universal Declaration of Human Rights. It is true that there are some very important concerns about data protection and privacy in the face of the mass collection of our data by companies, but the way the "right to be forgotten" is built is not appropriate. In fact, freedom of expression in the public interest is essential, even for information obtained unlawfully.
In my opinion, the right to be forgotten as designed by the GDPR is in total opposition to freedom of expression. The right to ask search engines to de-index web pages, as well as the right of erasure, is encompassed in this legislation. What I think is problematic is that under Article 17 it is data controllers (usually search engines) that are the initial adjudicators of requests. This is problematic because search engines do not own the content that the individual is asking to have removed. Editorial decisions must rest with publishers, not tech companies. Otherwise, it can evolve into a form of censorship, since as individuals we don't have control over what is removed. Furthermore, it is on our behalf that states claim a right to censor indexes through deindexing orders. Once again this raises the question of censorship: states impose on us a duty to forget those indexes, and we have no say. Why can't we deal with our information the way we want? Why must a third party be involved?
What should be done in order to change that? EU law does not provide ways to protect privacy effectively, and individuals often part with their information without knowing that they have surrendered some privacy.
The answer to the problem is maybe to abolish the right to be forgotten, since it does more harm than good. From a psychological or neurological point of view, no one can be forced to forget. Privacy is a very important right that must be protected, but there are limits. If you did something, you did it. If something was published, it cannot be unpublished.
Another question is where the right to be forgotten fits in a world that runs on blockchains, which are designed precisely to record everything permanently. |
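The tension with blockchains can be made concrete with a minimal sketch of an append-only hash chain, the core data structure behind them. This is an invented toy example, not any real blockchain implementation: each block commits to the previous block's hash, so erasing or editing any past record invalidates every block after it, which is exactly why a duty to forget is hard to reconcile with this design.

```python
# Toy hash chain: each block stores the previous block's hash, so the
# chain is tamper-evident. "Forgetting" any record breaks verification
# of everything that follows it. Illustrative only.
import hashlib

def block_hash(prev_hash: str, data: str) -> str:
    return hashlib.sha256((prev_hash + data).encode()).hexdigest()

def build_chain(records):
    chain = []
    prev = "0" * 64  # genesis value
    for data in records:
        h = block_hash(prev, data)
        chain.append({"data": data, "prev": prev, "hash": h})
        prev = h
    return chain

def verify(chain):
    prev = "0" * 64
    for block in chain:
        if block["prev"] != prev or block_hash(prev, block["data"]) != block["hash"]:
            return False
        prev = block["hash"]
    return True

chain = build_chain(["record A", "record B", "record C"])
assert verify(chain)

# Erasing record B retroactively breaks the chain from that point on:
chain[1]["data"] = "[erased]"
assert not verify(chain)
```

The design choice is deliberate: permanence is the feature, which is why a legal right of erasure has no straightforward technical counterpart here.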
| |
|
< < | Finally, instead of using a right to be forgotten that does more harm to freedom of expression, we could use other remedies, such as going to court, which would decide whether or not the information remains available to the public. |
| |
|
< < |
This is still judicial censorship, is it not?
|
| |
|
< < | We could also use mechanisms available on social media platforms that help to identify harmful content, which can then be removed by those platforms. |
|
|
|
< < | The draft does a good job explaining existing European legal phenomena. It also clearly voices the basic free expression arguments that limit the use of censorship orders in the interest of privacy. But the draft lacks a clear idea of your own to be placed in dialogue with these exterior sets of ideas, which is why the conclusion finds you basically throwing up your hands. Making the draft stronger means bringing your ideas to the front. You can say what you think, show how your idea emerges from your understanding of the two other points of view, then provide some real conclusion on which the reader can base her own further thinking.
As I said in the early lines of the draft, by abstracting from the narrow "right to be forgotten" fact pattern, you can make progress. Censorship orders seem useful, whatever their problems in principle, only so far as we concern ourselves with searching: in the end, it's about the state's right to censor indexes on behalf of individuals. But if you start from the role of the same parties in surveilling what everyone reads, it is rapidly apparent that censoring indexes, or indeed censoring content directly, will not address the center of the problem at all. |
| |
|
< < | Once it is apparent that the orders censoring indexes are a minor response to an infinitesimal part of the problem, one can no longer cast the issue as "right to be forgotten" against "free expression." That makes much more room for your own ideas. |
|
|