Monday, January 2, 2012

Information as junk food: life in the 'filter bubble'

The Filter Bubble by Eli Pariser


Eli Pariser's concept of the filter bubble is a metaphor: he fears we're increasingly exposed only to information that software has chosen for us in response to how we have behaved online in the past. So, we live in a bubble of our own making that filters incoming stimuli, leaving only those that we are likely to respond to.

On the face of it, that's a good thing. Companies like Google, Facebook and Amazon are giving us what we want. The companies boast that advertising is more 'relevant' when it's filtered for us. Amazon's suggestions of what we might like to buy are an obvious result of our previous searches and purchases. But there's more to it than that. Pariser says ordinary search results on Google are also filtered according to our individual online histories.

His starting point is a little-noticed Google announcement in December 2009 that its search results would be personalised, using information gleaned from previous searches and location, among other clues. Pariser asked two friends to test this out, comparing the results they got for the search query "BP". The two sets of results were "quite different": 180 million results for one search, 139 million for the other. One had more investment information in the results; the other had more on the recent Deepwater Horizon oil spill.

A bit spooky. But Pariser's point is that it's not good for us to live in a filter bubble: "more and more, your computer monitor is a kind of one-way mirror, reflecting your own interests while algorithmic observers watch what you click." We're just reinforcing our prejudices, seeing the kind of world we'd like to believe in, failing to learn from the unplanned exposure to new ideas and information that, say, a newspaper offers.

I found the most compelling account of the problem in the words Pariser quotes from sociologist danah boyd (sic). She is interested in the information diet we live on: it can be healthy (varied, unmediated), or it can cater to our worst instincts ("gossip which is humiliating, embarrassing, offensive"). The danger of the filter bubble is that "we're going to develop the psychological equivalent of obesity."

It's a great comparison: if we allow ourselves to be seduced by the shortest, most personal, easiest messages, we will surely find it hard to make the effort to stretch our imaginations into fields we don't normally frequent. It would be ironic if, at the very moment in human history when almost all information is available to us at almost no cost, we collectively shut ourselves off from all but the most immediately satisfying of it.

Is this the McDonaldisation of information?

Pariser's book is essentially a polemic on these dangers. It is filled out with some background on human psychology, tech history and the elite group of investors and futurologists most influential in how the internet is developing. It's all well done, but I couldn't help feeling that Pariser never quite matches the sizzling argument of his Introduction and first chapter. In fact, to understand the business incentive behind the filter bubble, it's hard to improve on the words he quotes at the start of his first chapter, from Andrew Lewis:

"If you're not paying for something, you're not the customer; you're the product being sold."

Worth remembering, and for more on that, see also the excellent You Are Not a Gadget by Jaron Lanier.


