
Are we getting caught up in Google’s “filter bubble”?

SmartCompany

Filters are at the core of what makes the internet useful to us. If you had to rationally comprehend all of the information available to you for a simple Google search about something like “bubbles”, your head would explode.

Luckily, Google filters the 74,900,000 results and, based on 57 contextual signals like location and history, gives me the 10 that are most likely to be relevant to what I am asking for (in my case, the Bubble Shooter flash game).

At the popular TED conference, Eli Pariser recently warned that “as web companies strive to tailor their services (including news and search results) to our personal tastes, there’s a dangerous unintended consequence: we get trapped in a ‘filter bubble’ and don’t get exposed to information that could challenge or broaden our worldview.”

The issue is very real, but it is also not new. Humans do have preferences, and humans build these robotic sorting algorithms.

In high school, when I visited the library to learn more about a topic for a school project, I would ask the librarian to recommend a book. My mates would also ask for recommendations, and the results would often be quite different based on what the librarian knew of us as individuals. What was a good recommendation for me was not necessarily a good recommendation for my friend, despite the fact that we were asking for the same thing. For this reason, over time I came to prefer recommendations from one or two particular librarians who knew more about me and what I had liked in the past. The more context they had, the more relevant their recommendations could be.

The sorting algorithms built into the most popular web platforms take this idea to a new extreme: they optimise for the query provided, using as much context as possible. In his presentation, Eli Pariser asked two of his friends to search Google for “Egypt” at the peak of the Nile revolution. Despite using the same search term, Scott Heiferman, with his interest in politics, got live political news, while the other friend received links to a range of travel and tourism options for the region. Both got what they asked for.

So what does that mean for the filter bubble?

There is no bubble. As humans we need to take responsibility for our worldview and be aware that each set of filters will give us differing results.

Google, Twitter, my mother and my barista are all going to give me different advice each through their own filters when I ask them what I should know about “bubbles”. It is up to me to take that into account and to ask a diverse set of sources if I am to gain the benefit of serendipity and to grow my awareness.

But let’s dig a little deeper into the results. When Eli’s friend got tourism links for his “Egypt” search, he also got a link to the Wikipedia page for Egypt. That page actually has a section about the 2011 revolution, with further links to a dedicated page with much more detail. Although Google wasn’t recommending news articles in its results, it was still providing a path for him to discover the political status of the country, if he wished.

Who is to decide which results should be shown? The web isn’t here to tell us what we “need”. The most difficult thing for me about the talk of the filter bubble is the notion that the web “should” give us what we “need” rather than what we “want”. In a rather pointed example, Facebook founder and CEO Mark Zuckerberg said that “a squirrel dying in front of your house may be more relevant to your interests right now than people dying in Africa”.

The filters give us what we ask for, whether that is narrow or diverse. The most important thing to remember is that it is up to the user to ask for that information.

This is certainly something you should consider next time you find yourself staring into the search box. The machines will only give you narrow results if you ask them to. Try asking them to expand your awareness.

Ross Hill is immersed in the world of innovation and entrepreneurship. At Yammer, the enterprise social network used by 100,000 organisations, he works with customers across Asia Pacific to transform their internal communications. Ross is also one of the founders of the entrepreneurial event The Hive and a micro-trustee of Melbourne’s Awesome Foundation.

This article first appeared on Technology Spectator.