Law in the Internet Society


Google, Give Us a Peek

Google is now the subject of antitrust scrutiny, having been accused of leveraging its dominance in "keyword targeted internet advertising" (consider this attempt number 1 at defining the relevant market here) to favor its own services and marginalize rival companies, to the detriment of a competitive market.

Depends on your definition of "scrutiny." This subject is much discussed by theorists in what used to be thought of as "law reviews," and the FTC engages in occasional throat-clearing. But it was apparent to all knowledgeable observers even before Christine Varney departed US DoJ for Cravath that there will be no serious dust-up between this Administration and Google during this or any second term. Similarly, the EC has no political stomach for, and precious little jurisdictional armament with which to maintain, such a prolonged confrontation.

Many have argued that Google tweaks its algorithm in ways that push down, in search results, certain sites that compete with it for eyeballs, allegedly because those sites have unoriginal content. Disgruntled site owners have cited Google giving Yelp the “TripAdvisor treatment” in support of their accusations. Yelp CEO Jeremy Stoppelman, when questioned about Google Places, remarked:

 "Google’s position is that we can take ourselves out of its search index if we don’t want them to use our reviews on Places…. But that is not an option for us, and other sites like us – such as TripAdvisor? – as we get a large volume of our traffic via Google search…We just don’t get any value out of our reviews appearing on Google places and haven’t been given an option other than to remove ourselves from search, how to improve this situation."
These are two completely different questions being conflated. At the beginning of the paragraph you are discussing the supposed unfairness of Google's algorithmic rankings. Later you are reflecting on a completely different issue: whether a site operator has some reason to object when material that is searchable to Google spiders--and to the entire rest of the universe on the same terms--is re-aggregated by Google in contexts that bring no immediate monetary advantage to the site operator whose page was searched. Whatever analysis is appropriate to each of these questions (about which I suspect in both cases we might differ), there is no appropriate mode of analysis that begins by confusing them.
 

The "Everyone Wins" Artifice

Google’s competitors have bolstered their campaign against Google’s black-box algorithms with the pretext of “consumer welfare” – defined almost exclusively in terms of consumers reaping the benefits of a fair marketplace that produces real innovation, presumably through merit-based competition.
This sentence doesn't make any actual sense. I'm not aware, at the moment, of any search engine operator proposing to establish service on the basis of transparently published ranking algorithms. If there were such an operator, its argument in favor of transparency would not be to consumers, except in the general sense that all propaganda in a society where consumption represents the predominant share of GDP is conducted in the meaningless bullshit vocabulary of "consumer welfare." Slight analysis reveals that it's not the consumer of search results who cares whether the algorithm is transparent. The very people who think that free software is unimportant because the bulk of ordinary PC consumers don't use it should see at once why this argument is nonsense. Transparent search algorithms are valuable to prosumers of search: that is, to the creation of a system of federated search technology, in which we all do searching for one another in some fast and efficient manner we haven't devised yet, and no party in the world knows what all the rest of us are thinking about.

But such a system, in which we spread search out throughout the Net, deconcentrating it, would remove the power over advertising that is held by a centralized search engine that knows what lots of people are looking for right now. In other words, the real competition in search is not between parties centralizing search to control advertising, for each of whom ranking algorithms that enable auction markets are quite legitimately and necessarily hypervaluable trade secrets, but between the architecture of centralized search and the architecture of federated search. As I've already explained in class, centralized search architecture—which makes Google, Bing, and almost all the other known variants—has what at present seems an insurmountable lead. The one-way link design of the Web is apparently overwhelmingly in its favor. And no one has yet built a model of successful, general, instantaneous, Web-wide federated search, so there is not even an unviable competitor to the reigning architecture.

On a twenty-year time scale, however, that's not true. The Web is less than 7,000 days old now. When it is 15,000 days old, federated search will be the dominant architecture, and ranking algorithms will be transparent, just as operating system kernel implementations are becoming transparent, because they too will be free software.

That has important implications for the future of advertising, and the distribution of economic power, that are probably the big story here. But you can't find them if you don't look for them.
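To make the architectural contrast concrete, a minimal sketch of the fan-out-and-merge shape a federated query might take is below. The peer list, endpoint names, and scoring are invented placeholders and the network call is stubbed out; no such protocol actually exists yet.

    # A hypothetical federated-search fan-out: each "peer" exposes a local search
    # endpoint, the query is sent to all of them concurrently, and results are
    # merged client-side. Peer URLs, the result format, and the scores are
    # invented placeholders; the network call is stubbed with a no-op sleep.
    import asyncio

    PEERS = ["https://peer-a.example/search", "https://peer-b.example/search"]

    async def query_peer(peer_url: str, terms: str) -> list[dict]:
        """Stand-in for an HTTP request to one peer's local index."""
        await asyncio.sleep(0)  # placeholder for network latency
        return [{"peer": peer_url, "hit": f"{peer_url}?q={terms}", "score": 0.5}]

    async def federated_search(terms: str) -> list[dict]:
        # Fan the query out to every peer at once and merge by each peer's own
        # score; no central ranker ever sees the full stream of queries.
        batches = await asyncio.gather(*(query_peer(p, terms) for p in PEERS))
        merged = [hit for batch in batches for hit in batch]
        return sorted(merged, key=lambda hit: hit["score"], reverse=True)

    if __name__ == "__main__":
        print(asyncio.run(federated_search("keyword targeted internet advertising")))

The point is the shape, not the details: each node queries its peers' local indexes and merges the results itself, so no central party accumulates the record of what everyone is looking for.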

 

An Overview of Online Search

Google's sponsored links are produced for businesses willing to pay Google when users click on their ads. Advertisements are triggered by the keywords a user enters into Google's search engine, and the amount Google charges for sponsored links is set by a keyword auction conducted through Google's AdWords platform. These auctions are automated based on parameters specified by each advertiser, and they occur instantaneously each time a keyword is entered into the search engine. An advertiser who places a higher bid for a keyword receives better placement of its advertisements when a user searches on that keyword. Additionally, Google employs a quality metric that adjusts both the placement of sponsored links and their cost to the advertiser, based on the links' relevance to the search query and the quality of the underlying webpage.

The heart of the dominant theory of Sherman Act Section 2 liability against Google relates to Google's use of quality scoring to influence the outcome of its AdWords auctions. The quality score is designed to maximize the relevance of search results, and thus the value of the search engine to users, the likelihood of revenue-producing impressions to advertisers, and revenue to Google. Allegations of anticompetitive conduct surrounding the quality score turn less on its existence--all major search engines use quality scores to improve relevance--and more on its arcane nature. The specific determinants of quality scores are hidden by design, and Google has repeatedly justified this opacity with two obvious axioms: (a) no company wants to share its secret formulas with its competitors; and (b) making the ranking formulas too accessible would make it easier for people to game the system.
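The mechanics can be illustrated with a common textbook simplification of such keyword auctions: a generalized second-price auction in which ads are ordered by bid weighted by quality score, and each winner pays just enough to keep its slot. The advertiser names, numbers, one-cent increment, and reserve price below are illustrative assumptions only; the actual AdWords formula is, as noted, secret.

    # A generalized second-price (GSP) auction with quality scores -- a common
    # textbook simplification of keyword ad auctions. The bids, quality scores,
    # one-cent increment, and reserve price are illustrative assumptions; the
    # real AdWords formula is secret.
    from dataclasses import dataclass

    @dataclass
    class Ad:
        advertiser: str
        bid: float            # maximum cost-per-click bid, in dollars
        quality_score: float  # relevance / landing-page quality, higher is better

        @property
        def rank_score(self) -> float:
            # Ads are ordered by bid weighted by quality, not by bid alone.
            return self.bid * self.quality_score

    def run_auction(ads: list[Ad]) -> list[tuple[str, float]]:
        """Return (advertiser, price-per-click) pairs in display order."""
        ordered = sorted(ads, key=lambda a: a.rank_score, reverse=True)
        priced = []
        for i, ad in enumerate(ordered):
            if i + 1 < len(ordered):
                # Second-price rule: pay just enough to beat the next ad's rank
                # score, discounted by your own quality score.
                price = ordered[i + 1].rank_score / ad.quality_score + 0.01
            else:
                price = 0.01  # placeholder reserve price for the last slot
            priced.append((ad.advertiser, round(min(price, ad.bid), 2)))
        return priced

    # A higher quality score can beat a higher bid and lower the price paid:
    print(run_auction([Ad("reviews-site", 2.00, 0.9), Ad("travel-site", 1.50, 1.4), Ad("local-blog", 1.00, 0.5)]))

In this toy example the middle bidder wins the top slot because of its higher quality score, which is exactly the lever the opacity complaints are about.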
There's no "dominant theory" of Google's antitrust liability because there's no viable theory. Not one serious party, public or private, has decided to make the vast bet required to test any theory, because there is no theory that demonstrates both harm and causation sufficient to invoke legal process by any party with a claim to propose.

The other problem here is that you aren't discussing ranking algorithms anymore, you're discussing advertising placement algorithms for sponsored links. These are not the same things, not the same algorithms, and not the same analytic issues with regard to transparency. You need to be clear from the beginning that you're not asking anything about search, the purpose of which is to find things in the web that the reader wants to see. You're asking about advertising placement, which is the algorithm matching stuff that wants to be seen to people who could be induced to click on it whether they actually want it or not. There is no analytic case whatever for transparency of the latter algorithms, except the general belief that people should be able to study the software they interact with, which is too weak a claim to outweigh the obvious operator interest in trade secrecy.

 

What a Transparent Google Might Look Like

Some argue that Google's power in the ecosystem is now so great that the company owes it to the rest of that ecosystem to become more transparent about how it ranks sites. One outcome of such an arrangement would be the creation of an independent, transparent quality scoring system that would rate pages and search results and publish a Klout-like score, with the details of the report open for everyone to see and scrutinize. Dot-com era entrepreneur and blogger Jason Calacanis has also called for Google to publish a calendar of algorithm updates and to keep the ecosystem informed of its plans.
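A minimal sketch of what such a published score might look like follows; the signals and weights are invented purely for illustration and are not drawn from any real rating system. The point is that when every input and weight is public, any site owner can recompute and dispute the number.

    # A hypothetical published quality score: every signal and weight is public,
    # so anyone can recompute and dispute a page's number. The signals and
    # weights below are invented for illustration only.
    PUBLIC_WEIGHTS = {"original_content": 0.5, "inbound_citations": 0.3, "uptime": 0.2}

    def transparent_score(signals: dict[str, float]) -> float:
        """Combine publicly documented signals (each scaled 0..1) into a 0..100 score."""
        total = sum(PUBLIC_WEIGHTS[name] * signals.get(name, 0.0) for name in PUBLIC_WEIGHTS)
        return round(100 * total, 1)

    # A site owner can verify the arithmetic against the published weights:
    print(transparent_score({"original_content": 0.8, "inbound_citations": 0.6, "uptime": 0.99}))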
Added:
>
>
But that's about search ranking. The case is weak, but you're not discussing it. Are you? The confusion on technical points is harmful even to basic coherence at this point.
 

The Case for Pandora’s Box

From an antitrust perspective, no business, not even a monopolist (assuming Google is one and has refused to deal while forgoing short-term profits), has a duty to reveal to competitors the formulas it uses to set prices (see http://newscenter.berkeley.edu/2011/06/07/digital-democracy/ – the quest for transparency on the internet in general is a red herring). Moreover, courts are reluctant to intervene on the basis of rivals' complaints about product design, because of the presumption that such intervention will chill innovation.
