Eli Pariser talks about the potential pitfalls of personalisation in his TED Talk, The Filter Bubble. In essence he’s saying that personalisation can create an echo chamber in which we are never exposed to views that some algorithm somewhere has decided we might not like.
Personally speaking, the fact that Google is no longer absolutely objective doesn’t particularly faze me. But then I was weaned on post-Marxism and critical theory, so I kind of dig the whole radical subjectivity thing. And the reality is that it is pretty easy to break out of this bubble. If I want to go to the Daily Mail I can, and even if they’re filtering out the really right-wing stuff because they’ve somehow identified me as some kind of wishy-washy liberal, I can live with that. If I don’t want Google to know who I am, I can log out (Pariser skirts around this issue by talking about user agents being used as filters, but really, how much is Google filtering on the basis of whether I use Firefox or Safari? I’m sure it’s placing much more weight on my logged-in profile and browsing history).
And actually what I really want is more personalisation. My big problem with Facebook is not that it filters the Wall, it’s that it filters the wrong things.
What I want are better filters. I want to get home and turn on a machine that has predicted my every content need and serves it to me. I don’t want to spend my precious free time working out what might be worth watching. I want to know what my best friends and family are doing, and to be shown the films, TV programmes and music that I will enjoy.
However, despite Pariser’s thesis not being particularly compelling in my opinion, he’s right to raise the issue. It’s important to understand that this is happening, and to know that you may have to try a little harder to find views, opinions and content that you won’t really like.