Filter Bubble

"Filter bubble" is one of the most talked-about terms in the internet industry in recent years. The concept was first proposed by internet activist Eli Pariser in his book "The Filter Bubble: What the Internet Is Hiding from You". A filter bubble is the intellectual isolation that can occur when websites use algorithms to selectively guess what information a user would want to see, and then serve information to the user according to that assumption.

Websites make these assumptions based on information about the user, such as past click behavior, browsing history, search history and location. As a result, the websites are more likely to present only information that accords with the user's past activity. A filter bubble can therefore leave users with significantly less contact with contradicting viewpoints, causing them to become intellectually isolated.
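As a rough illustration, this kind of signal-based personalization can be sketched as a toy scoring function. The topic labels and history format below are invented for illustration; they do not describe any real site's internals:

```python
from collections import Counter

def personalization_score(item_topics, user_history):
    """Score an item by how often its topics appear in the user's
    past clicks: more overlap with past activity means a higher rank."""
    topic_counts = Counter(topic for item in user_history for topic in item)
    return sum(topic_counts[topic] for topic in item_topics)

# A user whose click history is dominated by one viewpoint...
history = [{"politics-left"}, {"politics-left", "economy"}, {"politics-left"}]

# ...sees items matching that history ranked above anything unfamiliar.
items = {"left-op-ed": {"politics-left"}, "right-op-ed": {"politics-right"}}
ranked = sorted(items, reverse=True,
                key=lambda name: personalization_score(items[name], history))
print(ranked)  # the opposing viewpoint is demoted to the bottom
```

Nothing in this scoring rule ever surfaces the `right-op-ed` item, which is exactly the isolating behavior the term describes.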

Personalized search results from Google and the personalized news feed from Facebook are two prominent examples of this phenomenon. According to Mark Zuckerberg: "Rarely do we go past the page-1 of our Google searches. Highly filtered results (which most of us prefer - living in a bubble), meaning other stuff gets demoted. And the personalisation increases as algorithm gets more training on your interests, and thus the wall of bubble goes thicker and thicker."

The biggest problem with filter bubbles is that they prevent people from understanding the perspectives of people with different views. People and algorithms can behave similarly. For example, if I only wanted to hear perspectives that confirmed my world view, then any time I encountered someone with a different perspective, the following could happen:

  1. I'd feel bad because someone disagreed with me. They don't even have to directly disagree with me; they might just state their opinion, I might disagree with it, and feel bad about it.
  2. If I feel bad, I might adopt behaviours to avoid feeling bad again, ranging from not bringing up the topic with the other person to avoiding them entirely and cutting them out of my social circle.
  3. Or the disagreement might cause the other person to choose not to engage with me and no longer share their differing perspective with me.

Algorithms tend to be programmed to keep showing us content that we like and to show less of the content that we don't like - in a fashion similar to the people-programming above. Social media is notorious for this. Its algorithms suggest we follow people or sources that are already followed by our existing circle. They show us products and content based on our "preferences," and if we like things that fit our point of view, it becomes a cycle of self-affirmation.
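This self-affirmation cycle can be made concrete with a small simulation. This is a deliberately simplified model, not any platform's actual algorithm: a feed samples items in proportion to past clicks, a simulated user engages with only one category, and the feed narrows round after round:

```python
import random

def feedback_loop(rounds=50, catalog=("A", "B"), seed=0):
    """Simulate the self-affirmation cycle: the feed samples items in
    proportion to accumulated clicks, while the user clicks only
    category 'A'. The share of 'A' in the feed ratchets upward."""
    rng = random.Random(seed)
    clicks = {c: 1 for c in catalog}  # start from a uniform feed
    shares = []
    for _ in range(rounds):
        feed = rng.choices(catalog, k=10,
                           weights=[clicks[c] for c in catalog])
        for item in feed:
            if item == "A":           # user engages only with 'A'
                clicks[item] += 1
        shares.append(clicks["A"] / sum(clicks.values()))
    return shares

shares = feedback_loop()
print(f"share of 'A': round 1 = {shares[0]:.2f}, round 50 = {shares[-1]:.2f}")
```

Because every click feeds back into the sampling weights, the minority category is shown less and less, even though it was never explicitly removed from the catalog.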

This leads to confirmation bias: results on the Search Engine Results Page affirm existing viewpoints and inhibit users' ability to be autonomous thinkers. Pariser states, "algorithms need to encode a sense of public life and civic responsibility". If our search engines fail to provide an unbiased flow of information, this will only continue to fuel our ignorance and narrow our viewpoints. Individuals have a tendency to accept what reinforces their pre-existing beliefs, even when presented with contradicting proof. According to Psychology Today, "confirmation bias occurs from the direct influence of desire on beliefs. When people would like a certain idea/concept to be true, they end up believing it to be true. We pick out those bits of data that make us feel good because they confirm our prejudices. Thus, we may become prisoners of our assumptions".

The notion of a search engine gives the illusion that results are fair; however, filtering algorithms based on relevance fail to introduce us to new ideas and different perspectives. Google's algorithms deliver "relevant content" based on records of engagement and on what others like us, and around us, do. Users seclude themselves in their personal filter bubbles, leading to polarization and limited perspectives; accordingly, they become confined to a unique universe of information provided to them online.

We need the internet to keep us well-informed and unbiased, introducing us to different perspectives rather than dissuading us from thinking critically. In order to avoid confirmation bias, users must seek out reputable sources. Also, when researching a topic, users need to be cognizant of reading coverage from across the political spectrum in order to hear all sides of an argument. Seeking out people with opposing opinions and views will work in their favour to avoid confirmation bias and give greater insight into new perspectives. Finally, in order to think critically, one must play devil's advocate and try to argue against one's own opinion. Users need to challenge their own knowledge to avoid being enslaved by the cultural or ideological bubbles created by the nature of the internet.

One development that might arise in the future is that algorithms become aware of the danger of filter bubbles and try to counteract them, for example by providing us with new songs in addition to the previously auto-recommended ones. Making this work well remains a great challenge. Of course, the most direct route is to legislate on the relevant issues as soon as possible. In addition, popularizing knowledge of artificial intelligence and improving the overall ethical awareness of citizens are also indispensable.
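One simple way such a counter-measure could work (purely a sketch, not a description of any deployed recommender) is to reserve a fixed fraction of each recommendation batch for items the user has never seen, instead of filling every slot from past favourites:

```python
import random

def recommend(history, catalog, k=5, explore_rate=0.2, seed=0):
    """Fill a batch of k recommendations, reserving a fraction of the
    slots (explore_rate) for items absent from the user's history."""
    rng = random.Random(seed)
    seen = set(history)
    familiar = [item for item in catalog if item in seen]
    novel = [item for item in catalog if item not in seen]
    # At least one slot goes to something new, whenever anything new exists.
    n_explore = max(1, int(k * explore_rate)) if novel else 0
    picks = rng.sample(novel, min(n_explore, len(novel)))
    picks += rng.sample(familiar, min(k - len(picks), len(familiar)))
    return picks

catalog = [f"song{i}" for i in range(20)]
history = catalog[:10]            # the user has only ever heard songs 0-9
batch = recommend(history, catalog)
print(batch)                      # at least one slot holds an unheard song
```

The `explore_rate` knob is the interesting design choice: set it to zero and the sketch collapses back into the pure self-affirmation loop described earlier.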

What to do: ways to expose yourself to different perspectives and avoid filter bubbles:

  1. Get comfortable with disagreement. This may require meditation, deep breaths, big pauses, and restraint.
  2. Make your own decisions and stop doing what algorithms suggest you do.
  3. Intentionally seek out people in different geographies, from different backgrounds, with different views.
  4. Keep an open mind, but know what your deal-breakers are.
  5. Say yes to invitations from people outside your usual social circle.
  6. Practice more listening and understanding and less analyzing and judging.
  7. Spend less time on social media. It is much healthier to spend time with others in person and have conversations. If there is a difference in opinion or perspective, there is more opportunity to discuss and learn from each other in person than through asynchronous media like social media.



Other articles to read on this topic:

  1. How Filter Bubbles Distort Reality: Everything You Need to Know
  2. Artificial Intelligence Created Filter Bubbles. Now It’s Helping To Fight It
  3. The Causes And Effects Of “Filter Bubbles” And How To Break Free




Written by: Yongzhe Jiang