Here at the CAA, we’ve been reading a lot about cognitive science lately. Confirmation bias – the tendency of individuals to filter out stories and facts that challenge what they already believe – has been on my mind ever since I learned about it at a School for Creative Activism this summer. The concept seems shockingly obvious when you consider the growing political polarization in the US.
But it’s not just our own cognition that does this filtering – technology does it too. The algorithms that personalize the ads we see on Facebook and the order in which search results appear on Google all contribute to a filter bubble that only exacerbates the effects of confirmation bias.
What’s the problem with this? I actually don’t mind that advertisements for guns, fast food, and unnecessarily huge trucks are generally filtered out of my e-scape. But if I want to change the political terrain, I need to know what that terrain is. I’m going to have to leave the safety of my filter bubble, step outside, and talk to people who don’t spend their Friday nights listening to Radiolab and canning organic tomatoes. Leaving the bubble means paying attention to the stories that other people are reading.