Dan Kahan, a professor at Yale Law School, was interested in understanding how an individual’s beliefs affect their recollection of an event. To test this, he had two groups of students watch a video of a protest. One group was told that the protest was against legalized abortion, while the other group was told that it was against the military’s ban on openly gay and lesbian soldiers. After collecting information about the students’ political beliefs, Kahan found that the students’ reactions to the video were highly correlated with their pre-existing views.
For example, those participants who were against legalized abortion “saw” the protest as a peaceful expression of dissent, while those who were for legalized abortion “saw” the protest as an act of physical intimidation. From the experiment:
Subjects of opposing cultural outlooks who were assigned to the same experimental condition (and thus had the same belief about the nature of the protest) disagreed sharply on key “facts”—including whether the protestors obstructed and threatened pedestrians.
Here’s the rub though—all the students (across both experimental groups) watched the exact same video. Kahan had cleverly edited the protest video to make it topic neutral. No one saw anti-abortion signs or anything related to “don’t ask, don’t tell,” so the students were not triggered or persuaded by the content in the video, only by the content in their minds. We see what we want to see.
This concept, known as confirmation bias, explains why we tend to ignore or reject information that goes against our most deeply held beliefs. You have probably heard of this concept before, but with the rise of our digital gatekeepers (Facebook, Google, Twitter, etc.) it is getting easier and easier for you and me to experience confirmation bias. This is because these systems are optimized for engagement, not truth or intellectual diversity. Your actions curate your feed and, before you know it, you only see opinions you agree with.
I know I do this too, yet I still cannot help myself. For example, I used to follow @APompliano (“Pomp”) to get a different worldview on the crypto space. However, after I read the tweet below, I got so triggered that I unfollowed him:
Even though Pomp prefaced the tweet by noting it was an unpopular opinion, I couldn’t help reacting in such a negative way. That’s because this tweet challenges a fundamental belief I hold about global equity markets. Pomp may well have a valid criticism of public markets, and it doesn’t do me any good to ignore perspectives like his (Note: I have since re-followed Pomp if he ever comes back to Twitter).
My point is that we need to be careful about how we curate our feeds, because we can easily create a false view of reality. You might think that your beliefs come about as the result of deep analysis and critical thinking, but this probably isn’t the case. In her book Thinking in Bets: Making Smarter Decisions When You Don’t Have All the Facts, Annie Duke describes how belief formation actually works:
- We hear something;
- We believe it to be true;
- Only sometimes later, if we have the time or inclination, do we think about it and vet it, determining whether it is, in fact, true or false.
Of course, this process doesn’t apply when we already have a belief about a topic, but for new information it is spot on. For example, did you know that there is more ice in Greenland and Siberia than at the North Pole and South Pole combined? Of course you didn’t, because I just made that up. Maybe there is. Probably not. But if I hadn’t told you that I made it up, would you have done the research to check it? Probably not.
This is why confirmation bias is so prevalent. People don’t like thinking, but they love certainty. Thinking takes mental energy and time, so why do it? Just consider how much of your day is spent on autopilot versus in analytical thought. Autopilot is efficient, and there is nothing wrong with this, but it’s easy to see how confirmation bias arises from such mental shortcuts.
Keep Your Identity Small
So how should you fight back against confirmation bias? I previously recommended challenging your beliefs, but after reading an incredible 2009 essay by Paul Graham, I realized that this advice was incomplete. There is a better way to fight back—keep your identity small. Graham argues that many political and religious discussions tend to be useless because the topics are tightly linked to an individual’s identity. When you identify with an idea, you make it harder to argue about the idea, because you instinctively feel like any disagreement is an attack against YOU and not the idea. This is true whether you are a pro-lifer or a permabear. It also explains why I got so triggered by Pomp’s tweet above. I identify as an “optimistic capitalist,” and his tweet took jabs directly at my worldview.
The better way to view the world is through what Paul Saffo calls strong opinions, weakly held. You do research (or use your intuition) to form opinions on an issue, then change those opinions when you receive data suggesting they are incorrect. Think of your beliefs like clothing, not tattoos: you want to be able to change and adapt them easily, not have them become a permanent part of who you are. This will serve you well as an investor and decision maker throughout your life.
If you want to become a better decision maker, I highly recommend buying Thinking in Bets by Annie Duke. Annie used her decision-making skills to become a world champion poker player, and she shares how you can apply her insights to the decisions you make every day. Everybody I know who has read this book has loved it, and I can tell you with certainty that excerpts from it will make their way into future blog posts. So, good luck keeping your identity small and thank you for reading!
If you liked this post, consider signing up for my newsletter.
This is post 78. Any code I have related to this post can be found here with the same numbering: https://github.com/nmaggiulli/of-dollars-and-data