|
< < |
META TOPICPARENT | name="FirstPaper" |
|
> > |
META TOPICPARENT | name="SecondPaper" |
|
| |
|
< < | Undermining the Data Mining |
| |
|
< < | Limiting factors on the enforceability of copyright claims have always included the legitimate uses of works envisioned by the Copyright Act, including archival uses, as recognized in cases like Vault v. Quaid Software. In light of recent proposals for extensive archiving for the sake of "fighting child pornography," perhaps we can take a cue from Sweden's experience with the subject. Sweden's National Library has a legal obligation to collect everything printed in Sweden or in Swedish, including pornographic magazines, even those containing child pornography, which has drawn it into a still-developing scandal. Because U.S. law criminalizes the production, distribution, and possession of child pornography, perhaps surveillance could be undermined by requiring ISPs to retain not only logs but the actual content of transmissions, which could subject them to liability for possessing such illicit content. In all likelihood, ISPs would gain some kind of further immunity in any legislative deal that was struck, but maybe ISPs are already storing transmissions somewhere and could be prosecuted for possessing child pornography even without any knowledge of it. |
> > | Note: I know this paper is somewhat misplaced within the wiki, but there's no folder for second papers yet.
-- RickSchwartz - 25 Apr 2009 |
| |
|
< < | I haven't yet investigated whether ISPs would already receive protection for this sort of violation under Section 230 of the Communications Decency Act. From memory, I believe that section only immunizes ISPs from being treated as speakers or publishers of transmitted speech, which might not help where the substantive offense requires only possession. I will update this with my findings soon. |
> > | Privacy ReMinder |
| |
|
< < | Please note that I am not sure whether this will actually develop into a fully fledged second paper, but I would like to explore the idea a bit and invite others to comment freely without worrying about trampling on it as it develops. |
> > | Self Regulaxation |
| |
|
< < | -- RickSchwartz - 09 Apr 2009 |
> > | Privacy policies have been ineffectual restraints on data collection and use because of their inherent tension with the freedoms of contract and information gathering. The FTC takes action only against actual noncompliance with whatever terms a data collector chooses to set forth in its own privacy policy, and there is no private right of action for breach. This environment of self-regulation has encouraged the use of vague and equivocal language designed to avoid creating obligatory, and therefore enforceable, duties, while simultaneously giving the unsophisticated consumer the impression that consumer privacy is taken seriously. Perhaps over-optimistically, the fact that companies even nominally care about privacy might indicate that consumers would react adversely to an explicit disregard for the privacy of sensitive information. Though most trends suggest widespread user apathy about the collection and use of sensitive information, the recent uproar over Facebook's Terms of Service suggests that a critical mass of vocal and discontented users could incite some reform, or at least some superficial dialogue and attention to the issue. In communities reliant on their user bases, reform may be more substantial. For example, though the issue did not concern privacy, Digg's HD DVD encryption key controversy suggests that a sufficiently large user revolt can induce an alternative course of operations. |
| |
|
< < | The crime as laid out in 18 USC 2252(a)(4) is to "knowingly [possess]." The .se National Library case is an interesting wrinkle where you might actually find knowledge (although I know nothing about child pornography laws in Sweden), but I'd think that "dumb" archival by ISPs wouldn't be enough. |
> > | Common Privacy |
| |
|
< < | 47 USC 230(e)(1) specifically disclaims any impact of § 230 on federal criminal law. |
> > | Some have suggested creating standardized privacy policies in a Privacy Commons, emulating the eminently accessible style of Creative Commons, that would create enforceable duties. Though this might increase the transparency of the policies used by a data collector that chooses to adopt one, transparency alone would not incentivize the adoption of more robust privacy policies if people remain apathetic about the privacy options available to them or about the impact of such policies. Without some effect on users' preferences, a Privacy Commons is unlikely to succeed, given how accustomed users already are to ignoring privacy concerns. |
| |
|
< < | -- DanielHarris - 10 Apr 2009 |
> > | Furthermore, self-selection would limit the likely adopters of such policies to data collectors that would have given more protection in any event. Some suggest mandatory adoption of one of a range of standardized policies, but unless one of those policies preserved the status quo's flexibility of information collection and use, freedom of learning and thought would probably be undesirably impaired. Moreover, the multifaceted approach required to accommodate the myriad methods of data collection would make the creation of standardized and technologically relevant policies particularly difficult, especially when the technology will necessarily evolve faster than the policies. |
| |
|
> > | Going even further, some would prefer the creation of machine-readable privacy policies combined with user-controllable metadata that would dictate acceptable information use and collection beforehand (a rough sketch of what such a policy might look like appears at the end of this section). An existing subscription list for AdBlock called EasyPrivacy and extensions like BetterPrivacy are the closest this idea has come to fruition, though they only block processes run by the user's browser, which leaves much of data miners' surveillance untouched.
PrivacyMinder
Since the FTC won't realistically require anything other than self-imposed privacy obligations, users must demand that sites adopt real obligations as a condition of use. One step in the right direction would be a browser extension (which I would tentatively name "PrivacyMinder") that would prominently display, in easy-to-understand terms or iconography (again, like Creative Commons), the kinds of data collection and use the currently viewed website performs according to its privacy policy (or otherwise known facts about its data use). Privacy policies deliberately obfuscate their own terms in order to discourage all but the most ardent investigations into them; this extension would eliminate some of the transaction costs of parsing legalese and would bring otherwise overlooked terms of use to the fore. The extension would also ideally incorporate information about the terms of use to which users must agree. Just as Firefox displays within the address or status bar whether the browser is on a secure server (and Firefox could just as easily and prominently display whether a website is storing cookies), the end product would hopefully give users a more automatic and intuitive understanding of the degree to which websites invade user privacy. A change in attitudes on the demand side should encourage a more bilateral dialogue about the terms a website chooses to set for itself. And, however unlikely it is that a court would hold a unilateral act to constitute acceptance of those terms, such an extension could include a pop-up asking for affirmative assent to privacy-protecting policies before the user continues to browse, in an attempt to make browsewrap binding.
The icons and principles already developed by Mary Rundle or Aaron Helton would be a decent starting point for Privacy Commons, though they are a bit tame as they currently stand. Perhaps the non-judgmental attitude these icons currently convey would reduce resistance to the extension and even induce more cooperation by sites attempting to get favorable ratings. Whatever icons are used should also be color-coded in a traffic-light style or otherwise graded in order to indicate the extent of any use or collection. Furthermore, a default icon for mealy-mouthed language that does nothing for privacy protection (i.e., "no protection granted") ought to be jarring enough to remind users to be careful about activity performed on that site.
In the absence of machine-readable or standardized privacy policies, the extension could subscribe to a list containing manually generated assessments of which protections, or lack thereof, each domain's privacy policy triggers, in the same way AdBlock subscribes to a list of ad servers to block (the lookup logic is sketched below). The subscription would be created and updated collaboratively through a wiki or other moderated community (perhaps EPIC?), and if the database lacked information for a given domain, or the community found the policy too equivocal, the default display would indicate no protection of data and potentially unlimited collection and use. Triggering the default icon might incentivize sites to adopt standardized privacy policies that the extension would automatically recognize as corresponding to various levels of protection. |
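To make the machine-readable idea above more concrete, here is a minimal sketch in TypeScript of what such a policy, and a user-side check against user-controlled preferences, might look like. Everything here is hypothetical: the MachinePolicy and UserPrefs shapes, the field names, and the isAcceptable check are invented for illustration, since no such standard exists.

<verbatim>
// Hypothetical machine-readable privacy policy that a site might
// publish alongside its human-readable policy. All field names are
// invented for illustration; no such standard currently exists.
interface MachinePolicy {
  collects: Array<"email" | "browsing_history" | "location" | "purchases">;
  sharesWithThirdParties: boolean;
  retentionDays: number | "indefinite";
  allowsDeletionRequests: boolean;
}

// User-controllable metadata: the limits a user is willing to accept.
interface UserPrefs {
  maxRetentionDays: number;
  allowThirdPartySharing: boolean;
  forbiddenData: MachinePolicy["collects"];
}

// Decide, before any data changes hands, whether a site's declared
// practices fall within the user's stated limits.
function isAcceptable(policy: MachinePolicy, prefs: UserPrefs): boolean {
  if (policy.sharesWithThirdParties && !prefs.allowThirdPartySharing) {
    return false;
  }
  if (policy.retentionDays === "indefinite" ||
      policy.retentionDays > prefs.maxRetentionDays) {
    return false;
  }
  return !policy.collects.some(d => prefs.forbiddenData.includes(d));
}

// Example: a site that keeps browsing history indefinitely fails the
// check for a user who caps retention at 30 days.
const site: MachinePolicy = {
  collects: ["email", "browsing_history"],
  sharesWithThirdParties: false,
  retentionDays: "indefinite",
  allowsDeletionRequests: false,
};
const prefs: UserPrefs = {
  maxRetentionDays: 30,
  allowThirdPartySharing: false,
  forbiddenData: ["location"],
};
console.log(isAcceptable(site, prefs)); // prints "false"
</verbatim>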
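And here is a minimal sketch, again in TypeScript and again with invented names, of the extension's core lookup logic: consult a community-maintained subscription list, fall back to the jarring "no protection" default, and map the result onto a traffic-light grade. The ListEntry shape, lookupGrade, and the color values are all assumptions about how such a list might be structured, not a description of any existing extension.

<verbatim>
// Traffic-light grades for a domain's privacy practices (the scale
// itself is an invented illustration).
type Grade = "green" | "yellow" | "red";

// One entry in the community-maintained subscription list, analogous
// to a filter line in an AdBlock subscription.
interface ListEntry {
  domain: string; // e.g. "example.com"
  grade: Grade;   // overall assessment of the site's policy
  notes: string;  // short human-readable summary for the popup
}

// The default when a domain is absent from the list or its policy was
// judged too equivocal: assume no protection and unlimited use.
const NO_PROTECTION: ListEntry = {
  domain: "*",
  grade: "red",
  notes: "No protection granted; assume unlimited collection and use.",
};

// Look up the currently viewed site in the downloaded list, matching
// the host or any of its subdomains, and fall back to the default.
function lookupGrade(list: ListEntry[], host: string): ListEntry {
  return list.find(e => host === e.domain || host.endsWith("." + e.domain))
      ?? NO_PROTECTION;
}

// The extension would badge its toolbar icon with the grade's color,
// much as AdBlock badges its icon with a blocked-request count.
function badgeColor(grade: Grade): string {
  return { green: "#2a2", yellow: "#fa0", red: "#c00" }[grade];
}
</verbatim>

Defaulting to red whenever a domain is missing is the design choice doing the real work here: it turns community silence into a visible warning and gives sites a reason to seek a favorable rating.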
| |
|
< < | The Swedish case is great, but I feel as though trying to apply it to storing data on the internet could be tilting at windmills. I don't know whether it would ever be feasible to store all transmissions on the internet - the space requirements would be beyond vast, and the volume of transmissions seems to be increasing at least as fast as hard drive storage space. |
| |
|
< < | Also, I feel as though ISPs stored child pornography "unknowingly" for years via Usenet. The government's solution to that problem was to enter into "voluntary" arrangements with the ISPs to block Usenet... I know various vague legal threats were made, so you might want to check into that: http://news.cnet.com/8301-13578_3-9964895-38.html |
| |
|
< < | -- TheodoreSmith - 10 Apr 2009 |
|