Felix Stalder on Tue, 1 Jul 2014 15:25:12 +0200 (CEST)
<nettime> Corrupt Personalization
It's well worth reading the entire post, since there are a lot of
examples and funny scare stories. By now, a lot of people report that
they dislike Facebook (the most extreme of the social mass media) but
are still using it, for fear of missing something. This personally felt
contradiction -- voluntarily doing something you don't like -- is
structural network power in action.

Felix

http://socialmediacollective.org/2014/06/26/corrupt-personalization/

<....>

More generally, I think the danger of corrupt personalization is
manifest in three ways.

* Things that are not necessarily commercial become commercial because
of the organization of the system. (Merton called this
"pseudo-gemeinschaft," Habermas called it "colonization of the
lifeworld.")

* Money is used as a proxy for "best" and it does not work. That is,
those with the most money to spend can prevail over those with the most
useful information. The creation of a salable audience takes priority
over your authentic interests. (Smythe called this the "audience
commodity"; it is Baker's "market filter.")

* Over time, if people are offered things that are not aligned with
their interests often enough, they can be taught what to want. That is,
they may come to wrongly believe that these are their authentic
interests, and it may be difficult to see the world any other way.
(Similar to Chomsky and Herman's [not Lippmann's] arguments about
"manufacturing consent.")

There is nothing inherent in the technologies of algorithmic allocation
that is doing this to us; rather, the economic organization of the
system is producing these pressures. In fact, we could design a system
to support our authentic interests, but we would then need to fund it.
(Thanks, late capitalism!)

To conclude, let's get some historical perspective. What are the other
options, anyway? If cultural selection is governed by computer
algorithms now, you might answer, "Who cares?" It's always going to be
governed somehow. If I said in a talk about "algorithmic culture" that I
don't like the Netflix recommender algorithm, what is supposed to
replace it? This all sounds pretty bad, so you might think I am asking
for a return to "pre-algorithmic" culture: let's reanimate the corpse of
Louis B. Mayer and he can decide what I watch. That doesn't seem good
either, and I'm not recommending it. We've always had selection systems,
and we could even call some of the earlier ones "algorithms" if we want
to. However, we are constructing something new and largely unprecedented
here, and it isn't ideal.

It isn't that I think algorithms are inherently dangerous or bad --
quite the contrary. To me this seems like a case of squandered
potential. With algorithmic culture, computers and algorithms allow a
new level of real-time personalization and content selection on an
individual basis that just wasn't possible before. But rather than use
these tools to serve our authentic interests, we have built a system
that serves commercial interests which are often at odds with our own --
that's corrupt personalization.

If I use the dominant forms of communication online today (Facebook,
Google, Twitter, YouTube, etc.), I can expect content customized for
others to use my name and my words without my consent, in ways I
wouldn't approve of. Content "personalized" for me includes material I
don't want, and obscures material that I do want. And it does so in a
way that I may not be aware of.
This isn't an abstract problem like a long-term threat to democracy,
it's more like a mugging -- or at least a confidence game or a fraud.
It's violence being done to you right now, under your nose. Just click
"like."

-- 
||||||||||||||||||||||||||||||||| http://felix.openflows.com
|OPEN PGP: 056C E7D3 9B25 CAE1 336D 6D2F 0BBB 5B95 0C9F F2AC

# distributed via <nettime>: no commercial use without permission
# <nettime> is a moderated mailing list for net criticism,
# collaborative text filtering and cultural politics of the nets
# more info: http://mx.kein.org/mailman/listinfo/nettime-l
# archive: http://www.nettime.org contact: nettime@kein.org