TWikiGuestFirstEssay 17 - 09 Oct 2020 - Main.KjSalameh
|
|
META TOPICPARENT | name="WebPreferences" |
| |
< < | People used to rely on news outlets to know what's happening around the world; now, most of us get our news from social media. Because we share articles that fit in with our beliefs and connect with others who have similar views to our own, the news we get becomes recycled based on our previous biases. Here lies the ultimate danger of social media: our news has become so polarized that we lack the exposure necessary for an advancement in society. | | | |
< < | The idea of our influences directing us toward belief and action is not new. Gustave Le Bon, one of the greatest philosophers dedicated to the work on crowd psychology, makes the case that since the dawn of time we have always been under the influence of religious, political, and social illusions. He states that because the masses have always been under these influences, we are ingrained to seek out an illusion to grasp onto under any and all circumstances. He states that philosophers in the 19th century have worked greatly to destroy these illusions, but have not been able to provide the masses with any ideal that could sway them. Due to this, the masses now flock to whichever rhetorician whets their appetites. It seems social media has become a universal outlet to which we grasp onto our illusions, as Le Bon mentions, refusing to expose ourselves to viewpoints that differ from our own and thereby narrowing our visions of reality and widening the divisions we have from one another and, perhaps even, from truth. | | \ No newline at end of file | |
> > | Polarization and the Division of Society
Social Media and the Masses
Process of Polarization and Potential for Progress
Polarization and the Division of Society
People used to rely on news outlets to know what's happening around the world; now, most of us get our news from social media. For example, we used to read articles to determine 'what' is happening, and then we used to think for ourselves about 'why' this happened and 'how' we should feel about it. Now, with convenience at our fingertips, we are in the midst of a reversal. The politics we are partial to already define for us 'what' has happened. This is a product of the increasing bipolarization and division in our society. To a large degree we already know what we will believe and what we will not accept, establishing a dangerous dichotomy of thought. Along these lines, the convenience of social media - the content of which is continually shaped by unseen forces and algorithms that prey on our technological footprints - has fed into this dichotomy. Now most follow their news to better understand the 'how'--how should we feel? How should we react? What fits the narrative of the rhetoric we've already accepted? And because we share articles that fit in with our beliefs and connect with others who have similar views to our own, social media makes it easy for us to bolster this mindset of finding support for our biases rather than allow new information to broaden our insights. The news we take in becomes recycled based on our previous biases. Here lies the ultimate danger of social media without due regulation: the guided polarization of digital news only exacerbates the existing divisions in our society.
Social Media and the Masses
The idea of our influences directing us toward belief and action is not new. Le Bon, a polymath dedicated to the work on crowd psychology, makes the case that since the dawn of time we have always been under the influence of religious, political, and social illusions (See The Crowd: A Study of the Popular Mind). He states that because the masses have always been under these influences, we are ingrained to seek out an illusion to grasp onto under any and all circumstances. He noted that while philosophers in the 19th century worked greatly to destroy these illusions, they were not able to provide the masses with any ideal that could effectively sway them. Accordingly, the masses now flock to whichever rhetorician whets their appetites. Le Bon may have written his seminal work at the turn of the 20th century, but his words seem appropriate now more than ever. Social media has become a universal outlet through which we grasp onto our illusions and refuse to expose ourselves to viewpoints that differ from our own. Living in this new digital age, we are thereby narrowing our visions of reality and widening the divisions we have from one another and, perhaps even, from truth. Truth itself has become fragmented, relying on the whims of the reader. All the while, most of us remain oblivious to the puppeteers behind the curtains.
It's natural for our experiences to dictate our way of thinking in the Lockean framework of epistemology, but the problem with polarization in social media today is that it leaves little to no room for genuine discourse. What social media offers us is a steady and consistent affirmation from our peers who think similarly to us. Social media is intrinsically designed to connect us with others who will encourage our way of thinking, even if our logic is flawed or our news misguided. In other words, social media has made it so convenient to get the assurance we want from others who already agree with us that productive speculation or positive self-doubt becomes a foreign process. Many people then become so encouraged by their opinions that they begin to confuse them for facts. In order to bridge the gaps in our society, we must, at the very least, understand the diverse makeup of our communal struggle for survival.
Process of Polarization and Potential for Progress
Social media and similar digital mediums largely influence our thinking through targeted advertisements. Every time we swallow the mental pill on Facebook, Reddit, and the like, the databases on those sites store our personal and private data to their advantage, keeping close track of what we search and what our interests are. This misuse of our privacy and the self-selective filter bubbles social media creates for us work to keep the masses addicted. We connect with others who have beliefs aligning with our own, we 'like' their posts and share their posts, and without a second thought allow behemoth companies to track our personal information and internet consumption tendencies. Social media works by continuing to offer us exposure to our interests; unfortunately, this is the problem. Since we are more likely to accept ideas that align with our pre-existing beliefs, and thus continue to scroll down our social media feeds, the posts that pop up first on our accounts are the news sources that work with our existing confirmation biases. Under such a system, what should be expected except a widening of the rifts that divide us?
If we want our society to progress more efficiently towards unity, we must depolarize our social media. To do this, we must begin by introducing legislation and regulation that prevent companies from providing overly filtered access to misguided illusions. It is not enough to fault the masses alone. If we read more articles from various news sources, share them with friends who hold our current viewpoints, and create further connections to others with entirely different perspectives, we may begin to undo the process of polarized information that has so heavily influenced our social media and negatively impacted our society. But to be truly successful, we must target the unseen as much as the obvious. |
|
TWikiGuestFirstEssay 16 - 09 Oct 2020 - Main.KjSalameh
|
|
META TOPICPARENT | name="WebPreferences" |
| |
< < | 1st draft of the 1st essay | | \ No newline at end of file | |
> > | People used to rely on news outlets to know what's happening around the world; now, most of us get our news from social media. Because we share articles that fit in with our beliefs and connect with others who have similar views to our own, the news we get becomes recycled based on our previous biases. Here lies the ultimate danger of social media: our news has become so polarized that we lack the exposure necessary for an advancement in society.
The idea of our influences directing us toward belief and action is not new. Gustave Le Bon, one of the greatest philosophers dedicated to the work on crowd psychology, makes the case that since the dawn of time we have always been under the influence of religious, political, and social illusions. He states that because the masses have always been under these influences, we are ingrained to seek out an illusion to grasp onto under any and all circumstances. He states that philosophers in the 19th century have worked greatly to destroy these illusions, but have not been able to provide the masses with any ideal that could sway them. Due to this, the masses now flock to whichever rhetorician whets their appetites. It seems social media has become a universal outlet to which we grasp onto our illusions, as Le Bon mentions, refusing to expose ourselves to viewpoints that differ from our own and thereby narrowing our visions of reality and widening the divisions we have from one another and, perhaps even, from truth. | | \ No newline at end of file |
|
TWikiGuestFirstEssay 15 - 07 Oct 2020 - Main.ClaireCaton
|
|
META TOPICPARENT | name="WebPreferences" |
| |
< < |
The Internet Society’s Nuclear Option
In class, we have discussed the importance of privacy and the risks of surveillance in an era of increasingly sophisticated behavior recording, prediction, and manipulation. As a society, we are becoming increasingly entrenched in a burgeoning ecosystem of surveillance capitalism.
Many agree that a fundamental redirect is in order; the broadly unregulated, widespread capture of behavioral data should be restricted or even prohibited worldwide. Ideally, we might even eliminate all previously collected behavioral information.
However, as I reflect upon the current state of the Internet Society, I cannot ignore the nonzero possibility that the war to preserve the privacy of behavioral data and prevent sophisticated behavioral influence has already been lost.
Within Google’s servers alone lie my proudest academic works, intimate secrets from my darkest moments, my tasks for the day, my plans for the year, a scatterplot of my social footprint, an extensive record of my movements, and contact information for every human I know. Facebook, Amazon, and Bank of America hold powerful data profiles of me as well. Add to that datasets compiled by the U.S. government and other state entities.
I write this as a relatively well-informed, well-educated, and concerned citizen. My dismal tale of ignorant surrender and subsequent inaction is all too common. Around the globe, various corporate and government entities hold massive troves of personal information regarding billions of humans.
Unfortunately, the deletion of this behavioral data strikes me as a functional impossibility. Such valuable digital information will not be destroyed by force. Considering the power of the parties who hold it and the existential threat that deletion would present, they will not cooperate either. We must also consider the general lack of support for such action at this time and the logistical difficulties inherent in such an effort. Accordingly, I assume that the behavioral data that has been collected will remain indefinitely.
Next, I consider the possibility that we can limit the capture of behavioral data to its present state.
Even if I completely unplug today, I have already leaked extensive information. The power of this data in combination with present-day tools is evident in societal changes as fundamental as declining sex drive and the swaying of national elections.
With such immense value, behavioral-data-driven tools will continue to advance even in the absence of new data collection.
The best-case scenario appears to be an incremental slowdown of behavioral data collection over several years with significant dissent by parties that are unmoved by widespread concern and have sufficient leverage to withstand external pressures (e.g., the Communist Party of China).
Considering these dynamics, I am concerned that a data-collection slowdown may be insufficient to eliminate threats of social control. Accordingly, it seems prudent to consider an alternate plan of action in case of continued progression into a surveillance-centric ecosystem.
Society’s current path is one in which the Parasite with the Mind of God is under construction…or simply undergoing perpetual renovations. Theorists such as Ray Kurzweil and Nick Bostrom believe that society is en route to creating superintelligent artificial intelligence, a digital system that is capable of outperforming humanity in all intellectual endeavors. Such a machine strikes me as the natural conclusion of a society in a feedback loop of data capture for observation, analysis, and influence.
Bostrom further claims that superintelligent A.I. “is the last invention that man need ever make” as it may execute any further self-enhancements and will be sufficiently intelligent to thwart attempts at intervention.
If we continue on this path, we must decide who should be in control of this ultimate project and what procedures will guide the decision-making process.
At present, the frontrunners in the race for big data and sophisticated machine learning seem to be Big Tech and national governments. Neither group embodies the goals or procedures that I want guiding such a project of ultimate importance.
Both are shrouded in secrecy and exist within competitive spaces that cultivate racing behavior. “Move fast and break things.” “It’s better to ask for forgiveness than to request permission.” As these tools become more powerful and the societal impact more drastic, such behavior becomes increasingly dangerous.
To avoid a future shaped by today’s likely candidates and their inherent flaws, I advocate the establishment of a socialized multinational AI research project that is subject to public input and oversight and is less constrained by capitalist and political forces. A unified global public project strikes me as the best opportunity to cultivate sufficient resources to surpass the efforts of Big Tech and national governments.
Even if such a project were initiated imminently, the hour is late and the competition is fierce. Thus, drastic action must be considered. Legislation granting data portability rights could be extremely helpful, allowing individuals to obtain their personal data from service providers and, in turn, share that information with the socialized project. Similarly, legislation that protects adversarial interoperability in the software industry could catalyze transitions away from predatory products upon which the public has become dependent. If necessary to achieve competitive dominance, further data collection on a consensual basis may be pursued.
While the collection and processing of behavioral information is inherently risky, an international socialized model may greatly reduce the risks of our present private and national models.
I do not advocate any surrender in the fight for privacy. I simply support the development of contingency plans. An arms race is afoot in both the private and public sector with many convinced that surveillance is the key to future dominance. In humanity’s failure to denuclearize, I see an inability of modern society to relinquish powerful tools of control and I fear that digital surveillance may be similarly destined to proliferate. | | \ No newline at end of file | |
> > | 1st draft of the 1st essay | | \ No newline at end of file |
|
TWikiGuestFirstEssay 13 - 12 Oct 2019 - Main.AndrewIwanicki
|
|
META TOPICPARENT | name="WebPreferences" |
| |
< < | Even before I walk into the apartment where I am babysitting, the family is watching me. They’re not home but they see me on the “Ring” and text, “I see the nanny let you in.” Suddenly they appear on their video Alexa, without warning and without my answering, to explain the bedtime procedures for their 3-year-old. At bedtime she wants to listen to music. Almost immediately her parents have turned it on from their phones. While sitting at a concert 60 blocks south they ignore Billy Joel, instead watching and listening to their daughter and me. | > > |
The Internet Society’s Nuclear Option | | | |
< < | Constant parent surveillance started in my generation. Friends got busted for lying about their whereabouts when their parents tracked their phones. Sneak in after curfew? Good luck. Your phone, the “Ring,” the cameras inside are the nosiest neighbors. For concerned parents the gadgets of the internet age allow for a type of helicoptering like never before. | > > | | | | |
< < | What if we told these concerned parents that with a few lines of Python anyone can watch? Or that there are websites listing webcams that are set to the default passwords (or without passwords) that anyone on the internet can access? | > > | In class, we have discussed the importance of privacy and the risks of surveillance in an era of increasingly sophisticated behavior recording, prediction, and manipulation. As a society, we are becoming increasingly entrenched in a burgeoning ecosystem of surveillance capitalism. | |
> > | Many agree that a fundamental redirect is in order; the broadly unregulated, widespread capture of behavioral data should be restricted or even prohibited worldwide. Ideally, we might even eliminate all previously collected behavioral information. | | | |
< < | Hacking is Easy | > > | However, as I reflect upon the current state of the Internet Society, I cannot ignore the nonzero possibility that the war to preserve the privacy of behavioral data and prevent sophisticated behavioral influence has already been lost. | | | |
< < | Accessing someone’s unsecured webcam isn’t difficult, and sites like Shodan and Insecam make this easier. Bots randomly scan for unsecured devices, something that can be done across the entire internet in a matter of hours. If one runs a quick search on Shodan, she can find a slew of web servers that use the username and password admin/admin or that can be accessed through a password found by googling “manufacturer default credentials.” These default credentials are conveniently assembled on ispyconnect.com’s “user guide.” Still other cameras can be accessed through known vulnerabilities such as Boa webcams. Boa has a vulnerability that allows you to reset the admin password. In 2015, security firm Rapid7 tested nine popular baby monitors for security. Eight of the nine got an F, the ninth a D minus. Despite the reporting on this in 2015, nothing has changed. | > > | Within Google’s servers alone lie my proudest academic works, intimate secrets from my darkest moments, my tasks for the day, my plans for the year, a scatterplot of my social footprint, an extensive record of my movements, and contact information for every human I know. Facebook, Amazon, and Bank of America hold powerful data profiles of me as well. Add to that datasets compiled by the U.S. government and other state entities. | |
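The default-credential weakness described above is trivially automatable, which is why bots can sweep the internet for it. The following is a minimal sketch of the check such a scanner performs; the manufacturer names and credential pairs are hypothetical illustrations, and the login attempt is a plain callback rather than real network code.

```python
# Hypothetical table of factory credentials, keyed by manufacturer.
# Real scanners use published lists like the one the essay mentions.
DEFAULT_CREDENTIALS = {
    "acme-cam": [("admin", "admin"), ("admin", "1234")],
    "babyview": [("root", "root")],
}

def find_default_login(manufacturer, try_login):
    """Return the first factory credential pair the device accepts, else None.

    `try_login(user, password)` is a callback returning True on success;
    a real scanner would speak HTTP or RTSP to the device here.
    """
    for user, password in DEFAULT_CREDENTIALS.get(manufacturer, []):
        if try_login(user, password):
            return (user, password)
    return None

# Simulated device that still uses its factory password.
device_accepts = lambda u, p: (u, p) == ("admin", "1234")
print(find_default_login("acme-cam", device_accepts))  # ('admin', '1234')
```

A device whose owner changed the password makes every lookup fail, which is exactly why experts propose forcing a password change before first use.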
< < | There have been accounts of mothers catching hackers hijacking the cameras. One mother noticed her baby monitor moving without anyone controlling it. She realized it was scanning the room and landing on her bed. Everyone who was supposed to have control was in the same room, not moving the device. Others reported their baby monitors talking. One particularly disturbing case involves a hacker yelling at babies on baby cams. | > > | I write this as a relatively well-informed, well-educated, and concerned citizen. My dismal tale of ignorant surrender and subsequent inaction is all too common. Around the globe, various corporate and government entities hold massive troves of personal information regarding billions of humans. | |
< < | If peeping Toms on the internet are watching through baby monitors, what comes next? Surely those who lived in Stalin’s Soviet Union would find it foolish to bring a device into your home that anyone can access. Even if you aren’t worried about your own government, there is nothing stopping other countries from peeping too. This can allow for more targeted advertising, election campaigning, and perfect price discrimination. Even if governments or companies aren’t themselves watching, the dangers of commodification of personal information are real. | > > | Unfortunately, the deletion of this behavioral data strikes me as a functional impossibility. Such valuable digital information will not be destroyed by force. Considering the power of the parties who hold it and the existential threat that deletion would present, they will not cooperate either. We must also consider the general lack of support for such action at this time and the logistical difficulties inherent in such an effort. Accordingly, I assume that the behavioral data that has been collected will remain indefinitely. | |
< < | The dangers of these insecure devices go beyond concerns of creeps or the hypothetical 1984-sounding concerns of the government or companies watching: they can bring down the internet. In 2016, DNS provider Dyn was attacked by Mirai botnets which took down sites including Netflix, Twitter, and Spotify largely using IoT devices (such as baby monitors) infected with malware. Hackers took complete control of the monitor. Further, baby monitors can grant a hacker access to the home network to get information from computers. | > > | Next, I consider the possibility that we can limit the capture of behavioral data to its present state. | |
> > | Even if I completely unplug today, I have already leaked extensive information. The power of this data in combination with present-day tools is evident in societal changes as fundamental as declining sex drive and the swaying of national elections. | | | |
< < | The Law | > > | With such immense value, behavioral-data-driven tools will continue to advance even in the absence of new data collection. | | | |
< < | As is common with the law and the internet, the law hasn’t caught up with the baby monitors. Some have noted the right to privacy should apply here. What is more of a violation of privacy than someone watching you in your bedroom? Seemingly natural applications of existing laws don’t go far enough to solve the problem. While applying peeping Tom laws to those watching over baby monitors could prosecute some people and give some justice to victims, avoiding prosecution wouldn’t be hard and it wouldn’t solve the problem. Security experts have proposed other solutions including regulation of baby monitors, allowing victims to sue the baby monitor companies, and hacking back. | > > | The best-case scenario appears to be an incremental slowdown of behavioral data collection over several years with significant dissent by parties that are unmoved by widespread concern and have sufficient leverage to withstand external pressures (e.g., the Communist Party of China). | |
< < | Security experts have called on the government to get involved by regulating IoT devices. Mikko Hypponen, chief research officer for F-Secure, for example, compared leaking WiFi passwords to devices catching on fire: it shouldn’t happen and the government should make sure it doesn’t. Experts have proposed civil and criminal penalties for creating unsecure devices and laws requiring buyers to change the default password before the device can be used. Others, however, believe regulation would be useless because U.S. regulations won’t affect other countries. | > > | Considering these dynamics, I am concerned that a data-collection slowdown may be insufficient to eliminate threats of social control. Accordingly, it seems prudent to consider an alternate plan of action in case of continued progression into a surveillance-centric ecosystem. | |
< < | Some have proposed allowing victims of baby monitor hacks to sue manufacturers or sellers of the monitors. The Mirai attack shows the widespread hacking of these devices and suggests the possibility of a class action suit. If companies are hit with hefty fines, they would be incentivized to send shoddy security for IoT devices the way of lead paint. | > > | Society’s current path is one in which the Parasite with the Mind of God is under construction…or simply undergoing perpetual renovations. Theorists such as Ray Kurzweil and Nick Bostrom believe that society is en route to creating superintelligent artificial intelligence, a digital system that is capable of outperforming humanity in all intellectual endeavors. Such a machine strikes me as the natural conclusion of a society in a feedback loop of data capture for observation, analysis, and influence. | |
< < | Still others have proposed a more radical solution: hacking back. Rob Graham, security researcher and hacker, suggested the NSA launch a proactive strike to knock compromised IoT devices offline. Graham sees this as a solution to U.S. legislation being useless overseas. While that may be true, there are likely other Constitutional concerns with the NSA hacking into people’s devices to knock them offline. | > > | Bostrom further claims that superintelligent A.I. “is the last invention that man need ever make” as it may execute any further self-enhancements and will be sufficiently intelligent to thwart attempts at intervention. | |
> > | If we continue on this path, we must decide who should be in control of this ultimate project and what procedures will guide the decision-making process. | | | |
< < | Conclusion | > > | At present, the frontrunners in the race for big data and sophisticated machine learning seem to be Big Tech and national governments. Neither group embodies the goals or procedures that I want guiding such a project of ultimate importance. | | | |
< < | This paper discussed the security concerns of hackers accessing baby monitors and what this could mean for commodification of personal data and access by companies and governments as well as widespread attacks. Other concerns with baby monitors go beyond the scope of this paper: children growing up constantly surveilled and the ethics of spying on your babysitter, to name a couple. Parents have begun to worry about sharing about their children on Instagram. A class action suit is currently pending against Disney for scraping data from children’s video games. It is time parents become concerned about the safety devices they bring into their homes. | | \ No newline at end of file | |
> > | Both are shrouded in secrecy and exist within competitive spaces that cultivate racing behavior. “Move fast and break things.” “It’s better to ask for forgiveness than to request permission.” As these tools become more powerful and the societal impact more drastic, such behavior becomes increasingly dangerous.
To avoid a future shaped by today’s likely candidates and their inherent flaws, I advocate the establishment of a socialized multinational AI research project that is subject to public input and oversight and is less constrained by capitalist and political forces. A unified global public project strikes me as the best opportunity to cultivate sufficient resources to surpass the efforts of Big Tech and national governments.
Even if such a project were initiated imminently, the hour is late and the competition is fierce. Thus, drastic action must be considered. Legislation granting data portability rights could be extremely helpful, allowing individuals to obtain their personal data from service providers and, in turn, share that information with the socialized project. Similarly, legislation that protects adversarial interoperability in the software industry could catalyze transitions away from predatory products upon which the public has become dependent. If necessary to achieve competitive dominance, further data collection on a consensual basis may be pursued.
While the collection and processing of behavioral information is inherently risky, an international socialized model may greatly reduce the risks of our present private and national models.
I do not advocate any surrender in the fight for privacy. I simply support the development of contingency plans. An arms race is afoot in both the private and public sector with many convinced that surveillance is the key to future dominance. In humanity’s failure to denuclearize, I see an inability of modern society to relinquish powerful tools of control and I fear that digital surveillance may be similarly destined to proliferate. | | \ No newline at end of file |
|
|