This is the fifth post in a series, sharing some reading I’ve been doing lately. This post is about the following book:
"The Filter Bubble: How the New Personalised Web Is Changing What We Read and How We Think" by Eli Pariser.
A bit of a summary:
Pariser writes about how sites like Google and Facebook (amongst others) use algorithms to filter out results they deem irrelevant to you. There is no longer a single Google – if two people search for the same thing, they will each get different results, depending on their previous online activity. This process is called 'personalisation' and allows vast swathes of content to be tailored to you. It can be incredibly useful when you want to narrow down a search for a fact or a local restaurant's name, but it can also have the effect of narrowing your view of the world: instead of seeing a wide variety of ideas, you tend to see your own views echoed back at you.
All these personalising algorithms combined make up the Filter Bubble – an online world that is totally unique to you. The effect is that we don't come across enough ideas that rub us up the wrong way, challenge our thinking and spark creativity. It also means that "by illustrating some possibilities and blocking out others, the filter bubble has a hand in your decisions. And in turn, it shapes who you become." The result, Pariser says, is a single identity that doesn't differentiate between the 'you' of different contexts and times, or the aspirational versions of yourself.
Pariser also discusses how this information about you is translated into money, through sharing your data with advertisers. He examines, too, the kind of content we tend to see, which may be more sex, drugs and rock and roll than important world issues. This skews our view of the world – "if television gives us a 'mean world,' the filter bubble gives us an 'emotional world'." And who decides what we should see? Not a human with ethics (or a lack thereof) but a program. The problem, Pariser says, is not so much that certain ideas are hidden from us, but that we have no control over what is hidden and what isn't – so we don't even know what is hidden.
Pariser concludes the book with some predictions for the future and some recommendations that basically put some of the control back in citizens’ hands.
My response to/thinking about this text:
Firstly, I was glad Pariser had written a book that made sense to a non-techie pleb like me! After reading it, I must say the thing that concerned me most was the idea that someone should be curating all the worthwhile stuff to read on the internet. Who is this someone? How do they decide what is noteworthy and what is not? Is it the same for everyone, throughout the globe? It was then that I came across Pariser's site "Upworthy", which aims to "help people find important content that is as fun to share as a FAIL video of some idiot surfing off his roof." I hoped that Pariser himself wasn't the one doing the curating, and he's not: it's a collaborative effort of, at the time of writing, about four people. I guess the idea just needs to catch on, to harness real diversity in ideas and opinions.
I also wonder whether it is really just a matter of being aware of all this, being a little bit clever about our internet usage, and carefully crafting our digital footprints. I know people don't like techie things being called tools, but isn't it just like knowing when to read Woman's Day versus Time magazine, and seeking out broad perspectives?