How well do you trust what you read online and in social networks? It is highly likely that the quality of the digital texts and links you encounter as you read, search, and sift through the internet has been slowly degrading. This is because, without your knowing it, something called the filter bubble is sifting and limiting your learning process. This may have the ultimate effect of narrowing your worldview and the connections you make online. It also has the potential to limit your students as they read, write, and participate in a networked, global economy. Many of us believe that we live in a connected world where the web affords unprecedented learning opportunities, makes information plentiful, and puts experts at our fingertips. It turns out that this may not be entirely true.
Living in a filter bubble
The reason for this is that before you even start searching and sifting for information online, something has already sifted that information for you. In other words, each of us sees a pre-curated list that is the output of an algorithm: a computer-based process, or set of rules, used to solve a problem. The websites and social networks that we use for research, news gathering, and watching cat videos increasingly use algorithms to filter their results to make them more pleasing to us. In this situation, algorithms are being used to determine what would be most pleasing for you: search engines, news sites, and social networks provide you with what they think you want to see, not with the broad selection of what is actually out there.
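To make the idea concrete, here is a minimal, purely illustrative sketch of how this kind of filtering can work. This is not any real platform's code; the function name, the post data, and the scoring rule are all invented for the example. The core pattern, though, is common: score each item by how well it matches what you have engaged with before, then show only the top matches.

```python
# A purely illustrative sketch (not any real platform's code) of how a
# personalization algorithm narrows a feed: posts whose topics match the
# topics you have "liked" before are scored higher, and only the top
# few results are ever shown to you.

def personalize_feed(posts, liked_topics, top_n=2):
    """Rank posts by overlap with topics the user has liked before."""
    def score(post):
        # Count how many of the post's topics the user already likes.
        return sum(topic in liked_topics for topic in post["topics"])
    # Sort by score (highest first) and keep only the top_n posts.
    ranked = sorted(posts, key=score, reverse=True)
    return ranked[:top_n]

posts = [
    {"title": "Cat video compilation", "topics": {"cats", "humor"}},
    {"title": "Local election analysis", "topics": {"politics"}},
    {"title": "New climate study", "topics": {"science", "climate"}},
    {"title": "Funny dog fails", "topics": {"dogs", "humor"}},
]

# A user who has only ever liked cat and humor posts...
feed = personalize_feed(posts, liked_topics={"humor", "cats"})
# ...sees more of the same, while politics and science drop out of view.
print([p["title"] for p in feed])
# → ['Cat video compilation', 'Funny dog fails']
```

Notice what the user never sees: the election analysis and the climate study are not rejected as bad, they simply never make the cut. That invisible pruning, repeated across every feed and search you use, is the filter bubble.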
The result of all of these algorithms is to put us in what is called a “filter bubble,” a term coined by activist Eli Pariser. A filter bubble may close us off to new ideas, subjects, and important information as algorithms decide on our behalf what we see or don’t see. Most of the time when we search online, it is good news that we can quickly find the exact thing we’re looking for, and algorithms are to thank for that. The challenge, however, is that as algorithms increasingly determine which texts we read, we are given a disturbingly incomplete picture of what is happening in the world around us. Individuals often gravitate to ideas that are familiar and that align with their existing perspectives. This develops a confirmation bias: a tendency to search for, interpret, favor, and recall information that confirms your pre-existing beliefs or hypotheses while giving less consideration to alternative possibilities. This is increasingly problematic given the recent rise in the amount of fake news and disinformation available online.
An example from Facebook
If you’ve ever wondered why you have thousands of friends on Facebook but see the same handful of them in your feed when you log in, it is because the algorithm has determined that this is what would please you the most. If you don’t see anything at all from other people, the algorithm has once again determined that this is best for you. Facebook is but one example of the many times and ways in which algorithms determine what you’ll see before you even see it. The same thing is happening with many other social networks, search engines, and news-gathering websites.
While we’re on the subject of Facebook, let’s dig a bit deeper. Go to www.facebook.com/ads/preferences. You’ll have to log in to Facebook before you can see this page, which details your ad preferences. Facebook compiles this information from your posts, likes, and collection of friends on the network. It also tracks you as you move and search online: those little buttons you see on other websites asking you to “like” a page on Facebook are tracking you as you move across the web, and you don’t even have to be logged in for them to keep a record of your history. Please also remember that Facebook is not the only company doing this.
While you’re on your Facebook ad preferences page, click through the different tabs to see what information the social network has about you. Facebook sells all of this information to advertisers. It even makes its own determination of your political views based on your activity: under the “Interests” tab, click on “Lifestyle and Culture,” and in that section you’ll find a box titled “US Politics.” This information is useful to advertisers and political campaigns who want to send you news, ads, and posts containing their message.
What can you do?
For the most part, there is not much that we can do to break free from this filter bubble, as many sites increasingly use personalized search tools to shape your feed of information. Within Facebook, Twitter, or other social networks, you can work to address some of this by following individuals or groups that have perspectives different from your own. You can also use a search engine like DuckDuckGo, or routinely use “Incognito mode” in Google Chrome or the equivalent in other browsers. Finally, there are many Chrome extensions that are useful for protecting your privacy and stopping others from tracking you online.
I think the best defense against filter bubbles may simply be awareness. Recognize that filter bubbles do exist and that they create a very real echo chamber that impacts your potential for literacy and learning. You should also discuss this with your students and investigate ways to actively connect with individuals or groups that have perspectives different from their own. You can start this dialogue by watching Eli Pariser describe the dangers of filter bubbles (YouTube link). From there, it needs to be an active fight on the part of every individual not to simply trust what they read online. We need to build healthy skepticism in everyone as they become thoughtful, critically aware, literate readers.
This is the unedited version of a post that originally appeared on the Literacy Daily blog from ILA. If you like posts like this, you should subscribe to my weekly newsletter.
Thanks to Doug Belshaw for the title and careful revisions.
flickr photo by Indigo Skies Photography https://flickr.com/photos/indigoskies/6300407679 shared under a Creative Commons (BY-NC-ND) license
Also published on Medium.