Heard of filter bubbles?
You may or may not have heard of the concept of filter bubbles, but if you are a regular user of internet search engines or social media platforms, there is a good chance you are in the middle of one right now.
In the video from GCF Global below, you will learn how algorithms designed to improve your user experience in a digital world can actually isolate you. Algorithms often curate content so that you are shown only news stories and social media posts biased toward your existing beliefs, cutting you off from differing viewpoints and perspectives. You can see this in action when searching through an internet browser, or you may have noticed how Facebook places news articles aligned with your political beliefs above those that might contradict them.
Eli Pariser first defined filter bubbles in his well-known TED Talk, where he stated that algorithms have become the "gatekeepers" for information online, in the same way that editors curate what gets into magazines and newspapers. The trouble is that these algorithms do what they do without the ability to consider the implications of the choices they make.
The challenge in the context of 'alternative facts' or 'fake news' is that filter bubbles have a particularly damaging reach. 'Fake news' stories often contain exaggerated or falsified claims, or clickbait, that algorithms looking for specific keywords, tags, or high engagement (such as likes and shares) are likely to be drawn to. Alternative facts are often presented in a way that appeals to fringe or extreme viewpoints and as a result can generate higher engagement rates, leading to a further algorithmic boost.
People who understand how these algorithms work can manipulate them to meet their own goals: influencing your thoughts and opinions. The algorithms that Facebook, Google, and other platforms use to present information are not yet able to measure quality or veracity, and this can create a perfect environment for viral, inflammatory, often untruthful content to rank at the top of news feeds and search results.
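To see why an engagement-driven feed has no defence against untruthful content, consider a minimal sketch of engagement-based ranking. The post data, the scoring weights, and the function names below are all invented for illustration; real platform algorithms are proprietary and far more complex. The point is only that a score built purely from likes, shares, and comments never consults whether a story is true.

```python
def engagement_score(post):
    # Weight shares and comments more heavily than likes: a common
    # assumption about engagement-driven feeds (exact weights are made up).
    return post["likes"] + 3 * post["shares"] + 2 * post["comments"]

def rank_feed(posts):
    # Sort posts by engagement alone; veracity is never consulted.
    return sorted(posts, key=engagement_score, reverse=True)

posts = [
    {"title": "Careful fact-checked report", "likes": 120,
     "shares": 10, "comments": 15, "accurate": True},
    {"title": "Outrageous clickbait claim", "likes": 300,
     "shares": 150, "comments": 90, "accurate": False},
]

for post in rank_feed(posts):
    print(post["title"])
```

In this toy feed the clickbait post scores 930 against the factual report's 180, so it ranks first; nothing in the scoring function could ever penalise it for being false.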
Across the political spectrum, manipulation of these algorithms and the filter bubbles they have created has influenced elections and referendums and contributed to the rise of more radicalised views on both the far right and the far left. The echo chamber created by filter bubbles clouds users' ability to hear the opinions of the other side while pushing them toward ever more extreme views.
Try to seek out content and discussion that challenges your preconceptions, the way you think, and your beliefs. Source content about a topic from more than one newspaper, book, or digital media outlet. Listen to and try to understand viewpoints and perspectives different from your own, even if you do not agree with them. Think logically and rationally about the content you read or hear: What is its purpose? Is it credible? What is the content or author trying to achieve?