The Trouble with Customization
A New York Times headline that caught my eye today says (and I paraphrase) that you can mute the views of people you disagree with in a single click, and then asks: but should you? The article talks about Facebook’s “unfollow” feature, I gather. The strange social construct that results from an “unfollow” is that you can spare yourself from being exposed to the words of someone you hold out as a “friend.” Taken in isolation, the feature is already quite strange, to me at least – either you stay connected to someone or you don’t, and taking in the information and views they share is part of the connection. If your disagreements with a friend got so bad that you could not listen to a word they said, that would be a real-life unfriending. Right?
More troubling, however, is the fact that Facebook can make the decision to distance us from certain people, and to isolate us from their views, without any conscious input from us. Customization comes not only in the form of ads that seize on some aspect of your public profile in an eerie way (Got law school debt? Want a “Massachusetts in New York” sweatshirt?) but also in an algorithmic shuffling and reshuffling of the posts we see based on our likes. Here the data gathering operation goes from carefully watching what you take in to pretty much determining what you take in.
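To make the mechanics concrete, here is a minimal sketch of what like-based feed ranking might look like. This is a toy illustration under assumed rules, not Facebook’s actual algorithm; the function, the data shapes, and the scoring rule are all invented for the example.

```python
# Toy sketch of like-based feed ranking. Illustrative assumption only --
# not Facebook's actual algorithm; the scoring rule is invented.
from collections import Counter

def rank_feed(posts, like_history, top_n=10):
    """Order candidate posts by how often the user liked each author before.

    posts:        list of (author, text) pairs awaiting display
    like_history: list of authors whose past posts the user liked
    """
    affinity = Counter(like_history)  # likes per author so far
    # Posts from frequently-liked authors float up; authors the user
    # never engages with sink below the cutoff and are never seen.
    ranked = sorted(posts, key=lambda post: affinity[post[0]], reverse=True)
    return ranked[:top_n]

feed = rank_feed(
    posts=[("alice", "agreeable take"), ("bob", "dissenting take")],
    like_history=["alice", "alice", "carol"],
    top_n=1,
)
print(feed)  # only alice's post survives; bob is silently filtered out
```

Even a rule this crude produces the dynamic described above: the dissenting friend is never explicitly unfollowed; their posts simply stop ranking high enough to be shown.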
OK, it’s my choice to use Facebook, and I should have shut it down a long time ago. Yet the same dynamic has been observed with the search engine of choice for many, Google, which serves up different sets of results to different people for the same query. This is a little more serious, I think, because I can get a totally different spin on current events (and probably only one spin) from someone else. Cue the redoubling of divergent viewpoints, and more unfollowing on Facebook.
So we get what the companies would likely refer to as “customized” newsfeeds and “customized” search results. Customized for what purpose? Companies like Facebook and Google benefit immensely from the detailed information they have about us. Yet if they know us, or purport to know us, well enough to construct a selection of what we might like to see, and if we already receive, and might be prepared to accept, targeted advertising, why narrow further?
Perhaps, by seeking to place us in standardized “molds,” measured by various markers, Google and Facebook can streamline the placement of marketing. If so, they should be clear about the changing ways in which they tailor content, and ideally they should allow users to opt out. It is important to note, though, that the cookie-cutter model of sorting users would probably lead to less accurate results over time – after all, Google and Facebook are constructing avatars based on what they think they know about us, and reinforcing any errors through the limited set of information on which they allow us to give feedback.
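A tiny simulation illustrates that reinforcement. Everything in it is an assumption made for the example (the topics, the update rule, the initial error), not either company’s real system; the point is only that a profile updated solely on what the system itself chooses to show can never correct an early mistake.

```python
# Toy feedback loop: the profile is updated only for topics the system
# chose to show, so an initial error locks in. Illustrative assumption
# only -- not Google's or Facebook's real system.
import random

random.seed(0)
true_interest = {"politics": 0.5, "sports": 0.5}  # the user likes both equally
profile = {"politics": 0.9, "sports": 0.1}        # the avatar starts out wrong

for _ in range(200):
    shown = max(profile, key=profile.get)           # show the top-scoring topic
    clicked = random.random() < true_interest[shown]
    profile[shown] += 0.01 if clicked else -0.01    # feedback only on what was shown

print(profile)  # "sports" was never shown, so its score never recovered
```

The avatar’s error persists indefinitely, because the only feedback channel runs through content that the error itself selected.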
Categorization may, in a more sinister twist, help these two giants better identify suspicious activity or persons of interest to the government. However, tailoring the information all users receive in order to weed out a fairly small number of users is an unjustified and unreasonable tactic, unlikely to bear fruit in the form of precise results. As we have discussed in class, data mining has the potential to change the nature of elections, and it may be in this arena that Google and Facebook seek their payoff in lieu of keeping us fully informed. In that case, the categorization of what people like or don’t like (in the companies’ view), and the filtering of the information they receive to reflect it, help the companies determine who will likely vote for whom (or vote at all), and then reinforce that prediction with targeted sets of information.
The possible explanations mentioned above characterize the content customization done by Google and Facebook as a limited-purpose, narrow-interest undertaking. Whatever the specific motivation of the companies may be (their general purpose, we can assume, is profit), content customization is not the solution. If the companies were seeking to attract us as users, and as consumers of advertising, with this tactic, they have likely turned many people away by deciding for those users what it is they would like to see, read, and know. If they were simply driven by the interests of those who pay for advertising, or for data, they have gone a step too far by trying to construct online clones of users and deciding their future preferences for them. Before we slip into oblivion and forget the time when we had access to a healthier range of materials and viewpoints, it is imperative that we reclaim transparency or turn to other channels of research and communication. After all, without our further use of Google and Facebook’s services, there will be no more insidious customization.