Filter Bubbles

What would happen if you were only exposed to ideas you already agree with? It’s good to disagree sometimes, and being wrong is often the best way to learn what’s right. Yet internet algorithms push us in the opposite direction, showing us only the information we agree with while filtering out anything that might challenge our views. They serve this kind of content because we are more likely to share and engage with it, which keeps us hooked.

Now we’re trapped in a cycle of like-minded people and content. 

This cycle trains us to close our minds to other points of view and to dismiss compromise as “bargaining with an enemy.” Eventually we learn to dehumanize that “enemy,” making it impossible to empathize with those who disagree with us.

Think hard about the political information you engage with on a daily basis. Are you caught in a filter bubble?

We can’t let the algorithms tell us what we can and can’t learn. Pop the filter bubble and think for yourself.
