|
META TOPICPARENT | name="FirstPaper" |
|
|
< < | Big Tech Poses a threat to Free Expression |
> > | Big Tech Poses a Threat to Privacy and Free Expression |
| -- By AbdullahiAbdi - 12 Mar 2022 |
|
< < | Introduction |
| |
|
< < | In this essay, I will explore how big technology companies such as Facebook are undermining the right to freedom of expression, especially in relation to their inconsistent moderation policies. |
> > | Technology companies have somehow managed to create the perception that they control how the masses communicate over the internet. While technically anyone on earth can stand up a webserver and create a communication platform, the reality is that giant private technology companies such as Facebook are becoming the governors of the marketplace of ideas. They control how those who use their platforms may speak. This greatly threatens freedom of expression and speech. |
| |
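To make the technical point above concrete, here is a minimal, hypothetical sketch of how anyone can stand up a webserver using nothing but Python's standard library; it is an illustration added for context, not part of the essay's sources or any platform's code.

<verbatim>
# Minimal sketch: publishing speech on the open web without any platform's permission.
# Assumes only Python 3's standard library; the names here are illustrative.
from http.server import BaseHTTPRequestHandler, HTTPServer

class OpinionHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Serve a short statement to anyone who requests the page.
        self.send_response(200)
        self.send_header("Content-Type", "text/plain; charset=utf-8")
        self.end_headers()
        self.wfile.write("Published without any platform's approval.\n".encode("utf-8"))

if __name__ == "__main__":
    # Listen on port 8000; a public server would also need a reachable
    # address and, in practice, a domain name.
    HTTPServer(("0.0.0.0", 8000), OpinionHandler).serve_forever()
</verbatim>

Running this on any internet-reachable machine makes the text available to anyone who has the address; finding an audience is another matter, which is why the platforms' role as gatekeepers of attention still matters in practice.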
|
< < | Freedom of expression is one of the human rights most strongly protected by both domestic and international law, not only because of its particular importance in enhancing democratic norms but also because it is a key right that makes other freedoms, including the freedoms of assembly and association, possible. |
> > | Another thing that is often overlooked is these platforms’ pervasive surveillance of reading, which harvests people’s private data and patterns of behavior. This threatens not only freedom of expression but also freedom of thought. Technology should serve the people rather than turn their data against them. |
| |
|
< < | Article 19 of the Universal Declaration of Human Rights (UDHR) states “everyone has the right to freedom of opinion and expression; this right includes freedom to hold opinions without interference and to seek, receive and impart information and ideas through any media and regardless of frontiers.” Article 19 of the International Covenant on Civil and Political Rights (ICCPR) states “everyone shall have the right to freedom of expression; this right shall include freedom to seek, receive and impart information and ideas of all kinds, regardless of frontiers, either orally, in writing or in print, in the form of art, or through any other media of his choice.” |
> > | Since I will not be able to delve into every aspect of this pervasive surveillance and the ways big tech restricts free expression, in this paper I will explore how Facebook’s enforcement of its content moderation rules curtails free expression. |
| |
|
< < | The First Amendment of the US Constitution also protects freedom of expression including free speech. It guarantees freedom of expression by prohibiting Congress from restricting the press or the rights of individuals to speak freely. |
> > | Facebook’s moderation policy |
| |
|
< < | The First Amendment prohibits the State from restricting freedom of expression, but it does not bind non-state actors such as private companies like Facebook. The evolution of media technology, including the internet, and the way it has transformed the public square is proving to be problematic. This is because most of the main communication platforms that people rely on to access information and express opinions are controlled by a number of technology companies that seem to wield a great deal of power in controlling, managing and restricting free expression. One can argue that the current legal framework in the US regulating free speech is inadequate because it assumes the government to be the only regulator of the marketplace of ideas. Private actors such as Facebook are themselves governors of the marketplace of ideas. They control and choose who can speak and how those who use their platforms may speak. |
> > | Freedom of expression is a core human right protected under international law and in domestic legal systems around the world. It encompasses the right to receive, hold, and share opinions and is essential to democratic governance. But increasingly, big social media companies are restricting this right through content moderation policies. Moderation policies are the rules and practices that tech companies use to regulate the content shared on their platforms. |
| |
|
< < | Problems of the moderation policies |
> > | These companies often try to combat what is viewed as harmful content but run the risk of silencing protected speech. Interfering with or removing content affects the rights to freedom of expression and privacy, and can easily lead to censorship. |
| |
|
< < | Because the current legal regime under the First Amendment does not apply to these big tech companies, they seem to have a free license to censor content, limit diversity of expression and manipulate public opinion at will. The tech companies often do so through content moderation policies that are formulated and interpreted by the companies themselves; in the case of Facebook, by the company and its Oversight Board. In this essay, I will focus only on Facebook and will describe the shortcomings of its moderation policy and how it curtails free expression. |
> > | There are three major problems associated with Facebook’s moderation policies and practices. First, there is overreliance on automated algorithms, leading to incorrect removals. Second, there is often no clear appeal process for negatively affected individuals. Third, there is undue government manipulation of Facebook, leading to censorship. |
| |
|
< < | Before I joined Columbia Law School, I worked on a project that documented how authorities in Somalia manipulated Facebook by reporting journalists who were critical of the Somali government. Facebook responded by permanently deleting the accounts of those journalists. In all the cases, the suspension of the journalists’ accounts was final and there was no appeal against that decision. The journalists were informed that they had violated Facebook Community Standards, with which almost none of them were familiar. I found that the accounts of those journalists were the main source of independent information for the Somali public. The deletion of their accounts hindered public access to information. This is problematic because censoring critical views on the basis of broad interpretations of community standards that are not subject to public scrutiny limits free expression. The fact that government officials were involved in reporting the accounts to Facebook shows that there can be overlap between Facebook’s actions as a private company and its collusion with states to censor free expression, especially in jurisdictions outside the US. The question, then, is whether we need government regulation and oversight of such behavior. |
> > | Before I joined Columbia Law School, I worked on a project that documented how Facebook deleted several accounts of Somali journalists for allegedly violating Facebook’s Community Standards. All the journalists whose Facebook accounts were disabled said they did not understand why their accounts were shut down. They were also given no prior warning and were not afforded an opportunity to appeal the removal of their accounts. |
| |
|
< < | Another way Facebook limits free expression is on incitement-to-violence grounds, as in its decision to ban former President Donald Trump following the insurrection at the US Capitol on January 6, 2021. While incitement to violence can be a ground for limiting free expression, there is inconsistency in how Facebook applies its content moderation policy. A few days ago, Facebook decided to allow some calls for violence against Russian invaders in Ukraine. This new policy also allows users to call for the death of Russian President Putin in some countries. This is not the only time Facebook has changed its content moderation rules. It previously created an exception to its hate speech rules for world leaders but was never clear about which leaders got the exception or why. |
> > | My investigation revealed two main reasons why Facebook deleted these accounts. |
| |
|
< < | Recommendations |
> > | Use of automated algorithms |
| |
|
< < | Since Facebook is an important forum of mass communication and its policies could impact millions of people across the world, there should be some transparency mandate regarding its policy formation. Such a mandate should require Facebook to share more information with the public, researchers and government on how it makes content moderation policies. Its willingness to allow its platforms to be used to call for incitement and spread misinformation should also be subjected to scrutiny. |
> > | First, some of the journalists whose accounts were deleted were flagged through automated algorithms. Automated removals have long been criticized as more susceptible to error than human reviewers. These mistakes particularly affect Facebook users outside Western countries, since Facebook’s algorithms only work in certain languages, and automated tools often fail to adequately account for context or political, cultural, linguistic, and social differences. |
| |
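To illustrate why automated flagging errs in the ways described above, the following deliberately naive sketch shows keyword-based moderation; it is a hypothetical illustration, not a description of Facebook's actual systems. It wrongly flags a news report because it cannot read context, and it misses the same report written in Somali because its word list covers only English.

<verbatim>
# Hypothetical, deliberately naive keyword filter, for illustration only.
BANNED_TERMS = {"attack", "bomb"}  # assumed English-only word list

def flag_post(text: str) -> bool:
    """Flag a post if it contains any banned term (case-insensitive)."""
    words = text.lower().split()
    return any(term in words for term in BANNED_TERMS)

posts = [
    # A factual news report is flagged: the filter cannot tell reporting from incitement.
    "Journalists reported a bomb attack on the market",
    # Roughly the same report in Somali is missed entirely by the English-only list.
    "Weerar ayaa ka dhacay suuqa",
]
for post in posts:
    print(flag_post(post), "-", post)
</verbatim>

The first post is flagged and the second is not, which mirrors the two failure modes the paragraph describes: over-removal of legitimate speech and blind spots outside the languages the tools were built for.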
|
< < | Facebook should also not allow itself to be manipulated by government officials seeking to restrict the free expression of voices critical of governments. There should be increased due diligence when assessing purported infringements of Community Standards. Lastly, there should be a clear appeal process for individuals, including journalists, human rights activists and even politicians, whose Facebook accounts are suspended or permanently deleted. |
> > | One way Facebook is trying to address this problem is by employing content moderators to conduct the review, but recent media reports revealed that even where Facebook employs people to do this work, the staff are overworked and unable to make individualized decisions on these contentious and complex issues in the time afforded. For example, in one of its facilities in Africa, Facebook required its staff to moderate videos that could run for many minutes or even hours, along with other content, and to make decisions within one minute. |
| |
|
> > | Government interference
The other reason is government officials reporting critical journalists and other individuals to Facebook.
Somali government officials were deliberately reporting to Facebook the accounts of independent journalists critical of the authorities, claiming that these accounts violated Facebook’s Community Standards. Facebook appeared to comply with these requests without adequate investigation. For example, the Facebook account of a prominent government critic was incorrectly disabled and was only reactivated after we questioned Facebook about why it had been disabled in the first place.
In removing these accounts, which often provided the only source of independent information to people in Somalia, where the press is severely restricted by the government, Facebook was essentially facilitating state censorship.
International human rights standards
International human rights standards require private companies to ensure that human rights and due process considerations are integrated at all stages of their operations. The UN Guiding Principles on Business and Human Rights, for example, require all businesses, including technology companies like Facebook, to respect human rights.
Civil society groups have been thinking of ways to address this phenomenon and have come up with possible solutions. The 2018 Santa Clara Principles, for example, are a civil society charter that outlines minimum standards for companies engaged in content moderation and sets out five foundational principles: (1) human rights and due process considerations; (2) application of understandable rules and policies; (3) integration of cultural competence; (4) recognition of the risks of state involvement in content moderation; and (5) integrity and explainability of the policies. The charter also sets out three operational principles that should guide moderation policies: more transparency in content moderation, which requires companies to publicly report the actions they take to restrict or remove content; giving notice to individuals whose content is moderated; and establishing an appeal process for anyone who feels aggrieved by a moderation decision.
Recommendations to Facebook
Since Facebook is an important forum of mass communication and its policies impact many people across the world, there should be some transparency mandate regarding its policy formation. Such a mandate should require Facebook to share more information with the public, researchers and government on how it makes content moderation policies.
Facebook should also not allow itself to be manipulated by government officials seeking to restrict the free expression of voices critical of the authorities. There should be increased due diligence when assessing purported infringements of Community Standards. Lastly, there should be a clear appeals process for individuals, including journalists, human rights activists and even politicians, whose Facebook accounts are suspended or permanently deleted. |
| |
|
< < |
This draft could be improved by adding some context. Technical context would involve explaining that anyone on earth can stand up a webserver, so the supposed control over free expression reduces to "they have a popular way to do it, but no actual power of exclusion whatever." It would also take into account the platforms' pervasive surveillance of reading, which threatens not freedom of expression but freedom of thought. This would take emphasis away from moderation policy, which is basically a side issue, and put the scrutiny where it belongs. |
| |
|
< < | Government interference in platforms' operation, on the other hand, is no different from government interference in the operation of newspapers and broadcasters, except that even brutal governments have to make nice with the platforms if they don't want to block them altogether, which they mostly don't. So the analysis should, once again, not be devoted to the platforms, but to the fact that governments that don't respect rights don't respect rights unless it suits them. That has nothing to do with Zuckerberg whatever.
|
|
|