How did that happen?

My beady little eye on the world.

Since I had no use for either major-party candidate, I went to bed early last night. As the wits were saying, the bad news was that one of them would be elected. However, my phone pinged at 11 PM. It was a friend who is an ardent Hillary supporter, talking about how she wanted to die. After chatting with her, I went on Facebook and saw innumerable shocked and angry posts. (I’m an artist from New York, so the majority of my friends are liberal.) The same people who’d spent all day yesterday celebrating the imminent election of the first woman president were now posting things along the lines of, “I’m trying to understand. How did this happen?”

In the 1950s, psychologist Leon Festinger developed a theory of cognitive dissonance, which basically says that holding contradictory beliefs is stressful and people will do anything to squirm out of it. The more deeply held the belief, the stronger the dissonance. Among the strategies we use to cope is confirmation bias. We all tend to search for, interpret, and recall information in a way that confirms our preexisting beliefs. When we read and remember information selectively (and we all do), we are engaging in confirmation bias.

Today our best friends are machines that do that biasing for us. The average American spends 11 hours a day using electronic gadgets. All of them have some kind of confirmation bias built in (the channels you select, for example, affect the news and commercials you see), but the most insidious are your computer and your smartphone.

Yes, your computer is watching you, and yes, it is building a profile of you based on the sites you visit, your search terms, your purchases, and your social media activity. That’s relatively innocuous when it comes to what salad dressing you buy, but in 2012, the major parties started using the same tools to target political ads. We started seeing advertising that reinforced, rather than challenged, our beliefs.

Our clickstreams also influence the results we get when we search. Google has complicated (and patented) algorithms that decide the order in which results are shown when we search, and that ordering is based partly on our own history. It has a tendency to lump us into herds.
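
For the technically curious, here’s a rough sketch of what that kind of personalization looks like in principle. It is not Google’s actual code (that’s proprietary); the site names, scores, and the “boost domains you’ve clicked before” rule are all invented for illustration.

    # A toy re-ranking function, NOT Google's real algorithm: it only shows
    # how a click history can reorder the same results differently for
    # different people. All names and numbers here are made up.
    from collections import Counter

    def rerank(results, clicked_domains):
        clicks = Counter(clicked_domains)
        def score(result):
            # Start from the engine's own relevance estimate...
            base = result["relevance"]
            # ...then boost domains this particular user has clicked before.
            return base + 0.5 * clicks[result["domain"]]
        return sorted(results, key=score, reverse=True)

    results = [
        {"title": "Candidate praised",    "domain": "blue-news.example", "relevance": 2.0},
        {"title": "Candidate criticized", "domain": "red-news.example",  "relevance": 2.0},
    ]
    # Two readers type the same search and see opposite orderings:
    print(rerank(results, ["blue-news.example"] * 10))
    print(rerank(results, ["red-news.example"] * 10))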

Then there’s the fallacy that you choose your friends. Every time you open Facebook, it scans and collects all the posts made by all your friends and ranks them. The ranking algorithm is complicated and hidden, but how frequently you interact with the poster is certainly part of it, and so is whether you’ve hidden similar posts in the past.

Needless to say, you very rapidly weed out the people you don’t particularly like, the ones you find boring, or—in many cases—the ones who disagree with you. Most users only see the top few hundred posts, which they’ve selected through their own internal biases. The machine then takes over and reinforces these biases. The posts you favor influence the posts you see. This is why last night so many people posted things like, “But I don’t know a single person who supported him!”
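
Here’s an equally rough sketch of that feedback loop. Again, it is not Facebook’s real ranking code; it just shows how ranking posts by past interaction, and then mostly engaging with whatever lands on top, can turn a tiny starting preference into a one-sided feed.

    # A toy model of the feedback loop, NOT Facebook's actual algorithm:
    # posts are ranked by how often you've interacted with the poster, and
    # you mostly interact with whatever appears at the top of your feed.
    affinity = {"agrees_with_me": 1.1, "disagrees_with_me": 1.0}  # tiny starting bias

    for day in range(30):
        feed = sorted(affinity, key=affinity.get, reverse=True)
        # You see (and like) the top-ranked friend's post, which raises
        # that friend's rank in tomorrow's feed.
        affinity[feed[0]] += 1.0

    print(affinity)  # after a month: {'agrees_with_me': 31.1, 'disagrees_with_me': 1.0}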

This leaves us with terribly bad assumptions about what the people around us actually think, and it further polarizes us. It’s why so many of us were blindsided by the results. I’m not saying you should ditch your computer—heck, I want you to continue reading my blog—but I am saying that you need to test its version of reality.

Carol Douglas

About Carol Douglas

Carol L. Douglas is a painter who lives, works and teaches in Rockport, ME. Her annual workshop will again be held on the Schoodic Peninsula in beautiful Acadia National Park, from August 6-11, 2017. Visit www.watch-me-paint.com/ for more information.