Previously, I have written extensively about the filter bubble phenomenon described by Eli Pariser in his book “The Filter Bubble: What the Internet Is Hiding From You” and how personalized filter bubbles can be bad for democracy. Today I discovered a new perspective on the subject from an author who agrees that the filter bubble can prove troublesome for democracy and for users, but who criticizes Pariser’s solutions to the problem. The article starts off by summarizing what Eli Pariser said about filter bubbles and Facebook at the TED conference: there is a shift in the way information flows online, and if we don’t pay attention to it, it could be a real problem. This is where the article suggests that Eli Pariser’s solutions would make things worse, arguing that personalization is not always a problem. In fact, the algorithmic editing, or hidden code, at work on some websites is reasonably harmless.
For example, if you’re shopping on Amazon.com and looking at books written by Malcolm Gladwell and biographies of Mozart, Amazon’s recommendation engine might suggest books by Alex Ross in its “More Items to Consider” section. It does this because Alex Ross writes about the history of music and, like Malcolm Gladwell, also writes for The New Yorker. Using its algorithms, Amazon fills in the blanks for you so you don’t have to. However, if you do a Google search for “Egypt” on your PC and a friend in another town or state does the same on a MacBook Pro, you would expect to get the same results. Surprisingly, that is not the case at all. Google can tell where you are, what browser you’re using, and even what computer you have; using these signals, it can provide what it thinks is the best result.
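To make the idea concrete, here is a toy sketch of signal-based re-ranking. This is not Amazon’s or Google’s actual code; the tags, signals, and scoring rule are all hypothetical, invented only to illustrate how the same generic results can come back in a different order for two different users.

```python
# Hypothetical sketch: re-rank one set of generic results using
# per-user "signals" (inferred interests). Not any real site's code.

def personalize(results, signals):
    """Sort results by how many of their tags overlap the user's signals."""
    def score(item):
        # One point for every tag the user's signals share with the item.
        return sum(1 for tag in item["tags"] if tag in signals)
    return sorted(results, key=score, reverse=True)

# The same generic results for the query "Egypt"...
results = [
    {"title": "Protests in Tahrir Square", "tags": ["news", "politics"]},
    {"title": "Egypt travel guide",        "tags": ["travel", "tourism"]},
]

# ...ordered differently for two users with different inferred interests.
news_reader  = personalize(results, {"news", "politics"})
trip_planner = personalize(results, {"travel"})
```

Here `news_reader` puts the Tahrir Square story first while `trip_planner` puts the travel guide first, even though both users issued the identical query, which is exactly the effect Pariser demonstrated.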
So when Eli Pariser compared the Google search results for “Egypt” from two friends, he was shocked. One saw information about the political crisis and the protests in Tahrir Square, but the other only got a list of travel agents and fact books about the country. Here in the political sphere, Telegraph writer Will Heaven agrees with Eli Pariser that invisible algorithms have disturbing implications. Websites like Yahoo News are already “personalizing” their coverage, and others like The Huffington Post are apparently flirting with similar technology. Suppose you’re researching President Barack Obama on an impartial news site, but it knows you just visited his campaign website; even if you don’t realize it, you may end up receiving biased results. News websites will get to know what you want to read and spoon-feed you more of the same. As a result, you will only get access to the information these websites think is personally relevant to you. Anything that challenges or broadens your worldview won’t be visible. The only information you see will be the information you want to see, not what you need to see. In my next article, I will discuss Eli Pariser’s comments at the TED conference about Facebook’s News Feed and filter bubbles in more detail.
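The spoon-feeding loop described above can be sketched in a few lines. Again, this is a hypothetical illustration, not any news site’s real ranking code: each click raises a topic’s weight, the feed keeps only the highest-weighted stories, and whatever would challenge the reader quietly drops out of view.

```python
# Hypothetical sketch of the filter-bubble feedback loop: clicks
# become interest weights, and the feed shows only the top stories.

from collections import Counter

def build_feed(stories, clicks, size=2):
    weights = Counter(clicks)  # inferred interest per topic, from click history
    ranked = sorted(stories, key=lambda s: weights[s["topic"]], reverse=True)
    return ranked[:size]       # everything below the cutoff is invisible

stories = [
    {"title": "Obama campaign rally", "topic": "politics"},
    {"title": "Opposing party op-ed", "topic": "opposition"},
    {"title": "Celebrity gossip",     "topic": "entertainment"},
]

# A reader who mostly clicks politics stories...
feed = build_feed(stories, clicks=["politics", "politics", "entertainment"])
titles = [s["title"] for s in feed]
```

With this click history, the op-ed from the opposing party never makes the feed at all: the reader gets more of what they already wanted, and nothing that broadens their view.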