We Are All Prometheus Now

 
Ready for review. The ideas in this essay crystallized after watching Cory Doctorow’s recent lecture, The Coming War on the General Purpose Computer, which I strongly recommend.
 
We believe that in a free society, government enforces laws that may restrict actions, based on the need to protect safety and social order. We may even accept limited prohibitions on reading and listening -- but only in extraordinary circumstances, tied to what we think will keep us safe, whether fears of terrorism or concerns about child pornography. And yet, our moral intuition is that while the freedom to act can be curtailed, for us to be free, our government can't punish thought itself.
 
Computers challenge our ability to differentiate between a law that infringes the freedom to do something and one that infringes the freedom to think about it. This matters because computers are now the way we acquire and transmit knowledge. They can be combined with 3-D printers to manufacture physical objects and devices. They can run DNA synthesis machines and engineer microorganisms. Laws can be enforced to prevent the use of computers to copy movies, build counterfeit or dangerous goods, or produce patented or dangerous microorganisms. But how compatible are these laws with what we think is a free society?
 
When we think about computers, we don't usually think about what computers actually are, just what they do: the software they run or the content they display. The computer is just a passive, invisible entity. We don't even call most of them "computers." We use words like "smartphone," and "tablet" rather than "tablet computer." Kindles and Nooks are "e-readers." PlayStations are "game consoles," even though they are basically desktop PCs, and we usually ignore the computers in Blu-ray players and inside cars. But these are all programmable, universal computers.
 
Universal computers are special because they can execute any algorithm. Algorithms are just thoughts that have been broken down into pieces, a set of processes and rules that can be described using logic. Which algorithms a computer can run is limited only by the speed of its circuitry and its capacity to store data. Computers are "thinking machines," even though that concept usually comes up in exotic, metaphysical discussions of artificial intelligence and silicon consciousness, the stuff Kurzweil writes about. The reality of computers seems much more mundane: they just follow concrete, logical instructions. But computers are already thinking for us, if not exactly like us. They execute our thoughts, or someone else's, or a collective's, and then display the results.
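
To make the point concrete, here is a minimal, purely illustrative sketch: Euclid's ancient procedure for finding the greatest common divisor of two numbers, written out in Python as a handful of explicit rules. The example is arbitrary; any universal computer, whatever we choose to call it, can follow these steps.

    # Euclid's algorithm: a "thought" reduced to a few logical rules.
    # Rule 1: if the second number is zero, the first number is the answer.
    # Rule 2: otherwise, replace the pair (a, b) with (b, a mod b) and repeat.
    def greatest_common_divisor(a, b):
        while b != 0:
            a, b = b, a % b
        return a

    print(greatest_common_divisor(48, 36))  # prints 12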
 
The "Information Age" is characterized by the word "information." This is interesting, because information is a long, Latin-rooted word. “Information” is a word that removes itself intellectually from our living experience. "Knowledge" means basically the same thing, but it's not used as much. This is because "Information Age" is basically a marketing device, used to sell people on the idea that money can be made by buying and selling information. The word "knowledge" is bound up with "knowing," to human thought. Commercializing “thought” would be a tougher sell. We intuitively recognize that to control the marketplace of thought, means controlling thought itself. That’s actually the basis of marketing, really, but we don’t like to think about what that implies, so we prefer the word “information.” But the choice of word can’t avoid reality.
 
The problem is that the information the eager Information Age marketer sells is translated into a series of logical processes, run through a universal computer, and turned into numbers that can be stored and displayed. A universal computer can run any algorithm with which it is programmed. Duplicating what it has stored in its memory, even when it’s only cached there temporarily, is really easy. This means that profits can’t be extracted from the scarcity of information.
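
How easy that duplication is can be shown in a few lines. The following is only a sketch, in Python, with placeholder file names, but the principle holds for anything a computer can read into its memory.

    # Make an exact, bit-for-bit copy of whatever is stored in one file.
    # "original.bin" and "copy.bin" are placeholder names, used only for illustration.
    with open("original.bin", "rb") as source:
        data = source.read()       # the machine now holds the information in memory
    with open("copy.bin", "wb") as duplicate:
        duplicate.write(data)      # a perfect duplicate, at essentially no marginal cost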
 
In an attempt to make the information artificially scarce, sellers have tried increasingly sophisticated mechanisms to control it. But these are foiled again and again: universal computers can run the algorithms that defeat the restrictions, because the restrictions have to be leaky enough for the information to be distributed and read by paying customers. Information sellers respond by building restrictions that are increasingly fundamental to the operation of the computer. For example, software can be silently installed on a computer to secretly report unauthorized access when the computer goes online, or even to shut down the computer's operating system and ability to function entirely. As Cory Doctorow says, "digital rights management always converges on malware." This is especially common in computers that are marketed as smartphones and tablets, or embedded in systems like DVD players, a de-functioning disguised by avoiding the word "computer."
 
Anything thought builds, though, thought can undo. Even the most sophisticated means of locking up information can be broken. The knowledge of how to circumvent can be restricted by punishing the people who come up with the algorithms, censoring the websites that publicize them, and watching those who seek them out. But an algorithm running on a computer can get around all these measures. All it takes is knowledge and thought.
 
This is why copyright law in the digital age is inconsistent with what we think of as a free society. Enforcement means making circumvention illegal, and that means limiting thought, punishing it when it goes out of bounds, all to support an already obsolete business model. And enforcement can't actually prevent anything; it can only go after people after the fact, after the locks have been broken and the information runs free.
 
There will be harder questions as computers do more, and as we try to stop people from 3-D printing and synthesizing microbes in ways we fear will harm safety and social order. But when we try to restrict the use of computers to do these things, we need to recognize, first, that our countermeasures will fail, and second, that all we can do is punish anyone caught after the restrictions are already broken. And in weighing the laws we want as punishment, we need to decide how much enforcing them sacrifices our own personal freedom to think, in what we believe is a free society.
 
-- BahradSokhansanj - 17 Jan 2012
 
 