It was the beginning of 1959 and something was not quite right with John Nash. The famed mathematician, who would eventually go on to win the Nobel Prize in Economics for his work on game theory, was having a mental breakdown.
What started with innocent jokes about finding patterns in license plates progressed into a full-fledged belief that extraterrestrials were sending him encrypted messages through The New York Times. At one point, Nash also became convinced that men around Boston were wearing red neckties to get him to notice them. He was seeing patterns that weren't there.
Though no one knows why approximately 1 percent of the population across all countries develops schizophrenia, seeing connections between unrelated things (the technical term is apophenia) is a common symptom. You may already know Nash's story from A Beautiful Mind, but understanding why Nash believed what he did provides an important lesson for investors.
In late 1995, following his recovery from schizophrenia, Nash was asked why he had believed so many illogical things in his past. His answer revealed a core truth about how humans perceive patterns:
The ideas I had about the supernatural beings came to me the same way that my mathematical ideas did. So I took them seriously.
This is the problem that you, I, and every other human on Earth are born with — a faulty pattern detector. Just as John Nash couldn't discern between reality and the spurious patterns he saw, we cannot easily tell the difference between signal and noise. This limitation did not hinder us in a world of small sample sizes and little need for probabilistic thinking, but the modern world has changed all of that.
For example, in ancient times, if you spotted a tiger in the same cave three times, you probably wouldn’t go near that cave in the future. Today, if a mutual fund has outperformed for three years in a row, you might send them all your money. Different times, but similar thinking.
This matters for you as an investor, because you may end up making investment decisions based on patterns that are nothing more than random chance. Many investors chase performance by investing in the funds or sectors with the best recent track records. However, there is plenty of evidence that doing so typically leads to worse performance in the long run (see here, here, here, and here).
The thing that many investors tend to forget is that randomness will always exhibit some patterns. In fact, the British mathematician Frank P. Ramsey proved that no matter how disordered a system is, once it grows large enough it must contain some ordered substructure. This result, known as Ramsey's theorem, is often summarized as "complete disorder is impossible," and it illustrates why patterns exist within randomness.
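The smallest classic instance of Ramsey's theorem is the "party problem": among any 6 people, there must be either 3 mutual acquaintances or 3 mutual strangers, while 5 people is not enough to force this. The brute-force check below is my own illustration (it is not from the original post or its linked code): it colors every edge of a complete graph with one of two colors and verifies that a single-colored triangle is unavoidable at size 6 but avoidable at size 5.

```python
from itertools import combinations

def has_mono_triangle(n, coloring):
    """True if some triangle {a, b, c} has all three edges the same color."""
    for a, b, c in combinations(range(n), 3):
        if coloring[(a, b)] == coloring[(a, c)] == coloring[(b, c)]:
            return True
    return False

def every_coloring_has_mono_triangle(n):
    """Check all 2-colorings of the complete graph on n vertices."""
    edges = list(combinations(range(n), 2))
    for mask in range(2 ** len(edges)):
        coloring = {e: (mask >> i) & 1 for i, e in enumerate(edges)}
        if not has_mono_triangle(n, coloring):
            return False  # found a coloring with no ordered substructure
    return True

print(every_coloring_has_mono_triangle(6))  # True: order is forced at 6
print(every_coloring_has_mono_triangle(5))  # False: 5 is not enough
```

In other words, once the "system" reaches 6 people, a pattern (a monochromatic triangle) must appear no matter how the relationships are arranged — exactly the kind of unavoidable substructure Ramsey's theorem guarantees.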
To make this idea clearer, consider the following two sequences of 10 coin flips:
Which sequence is more likely?
If you have ever studied statistics, you will know that this is a trick question — both sequences are equally likely. I know that this answer doesn't feel right, but that is the point of this post. It seems rational to assume that sequence 2 is more random than sequence 1 because it exhibits less of a pattern. After all, if someone came up to you and flipped 10 heads in a row, you would start to question whether their coin was rigged, right? However, sequence 2 has the same probability of occurring as sequence 1 (1 in 1,024).
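The arithmetic behind the "1 in 1,024" is worth seeing directly: every specific sequence of 10 fair flips, patterned or not, multiplies ten independent factors of 1/2.

```python
# Any one specific sequence of 10 fair coin flips has the same probability:
# each flip contributes a factor of 1/2, regardless of whether it lands
# heads or tails, so the pattern of the sequence is irrelevant.
p = 0.5 ** 10
print(p)      # 0.0009765625
print(1 / p)  # 1024.0 -> "1 in 1,024"
```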
If we flipped a coin 400 times and plotted the results on a 20×20 grid, you would see some smaller patterns even though the sequence is completely random (note: red = tails, black = heads):
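You can reproduce this kind of grid yourself. The sketch below (my own, not the post's original R code) prints 400 random flips as a 20×20 grid of H/T characters and reports the longest streak — scan the output and your eye will find "clusters" and "runs" despite every flip being independent.

```python
import random

def longest_run(flips):
    """Length of the longest streak of identical outcomes in a sequence."""
    best = run = 1
    for prev, cur in zip(flips, flips[1:]):
        run = run + 1 if cur == prev else 1
        best = max(best, run)
    return best

random.seed(42)  # fixed seed so the illustration is reproducible
flips = [random.choice("HT") for _ in range(400)]

# Print the 400 flips as a 20x20 grid
for row in range(20):
    print("".join(flips[row * 20:(row + 1) * 20]))

print("longest streak:", longest_run(flips))
```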
My point is that many of the financial decisions you make throughout the rest of your life will be dominated by small sample sizes where randomness will likely play a role. If you combine this with recency bias (i.e., the tendency to overweight the most recent information), you can see how short-term patterns could affect your financial decision making. Therefore, before you make a financial decision, consider how chance could be influencing it.
The Disguise of Randomness
One of my professors in college used to do a wonderful exercise that beautifully illustrated the nature of randomness. He would give someone a coin and a sheet of paper and then ask them to do two things in private:
- Write down what they thought a sequence of 20 coin flips would look like.
- After doing step 1, flip the real coin 20 times and write down the actual sequence of flips elsewhere.
Within seconds of being shown the two sequences, my professor could always tell which came from the real coin and which was simulated. How did he do this?
My professor realized that people tend to switch back and forth between heads and tails too quickly in their simulations, while the real coin typically wouldn't. Most people feel that 4 or more heads (or tails) in a row looks non-random, so they balance their simulated sequence out with more tails (or heads). Ironically, this behavior makes their sequence less random and easier to identify when compared to the real coin sequence.
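A quick simulation (my own sketch, not from the original post) shows why streaks are the giveaway: a real sequence of 20 fair flips contains a run of 4 or more roughly three times out of four, yet people writing down an "random-looking" sequence rarely include one.

```python
import random

def longest_streak(flips):
    """Length of the longest run of identical outcomes."""
    best = run = 1
    for prev, cur in zip(flips, flips[1:]):
        run = run + 1 if cur == prev else 1
        best = max(best, run)
    return best

random.seed(7)  # fixed seed for a reproducible estimate
trials = 100_000
hits = sum(
    longest_streak([random.choice("HT") for _ in range(20)]) >= 4
    for _ in range(trials)
)
print(f"P(streak of 4+ in 20 flips) ~ {hits / trials:.2f}")  # roughly 0.77
```

So the professor's rule of thumb amounts to a statistical test: a 20-flip sequence with no streak of 4+ is the unusual outcome, and its absence is strong evidence that a human, not a coin, produced the sequence.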
I loved this exercise because it demonstrated how randomness likes to disguise itself with patterns. While it will always be a challenge to identify this disguise, it doesn’t mean we shouldn’t try. Thank you for reading!
If you liked this post, consider signing up for my newsletter.
This is post 54. Any code I have related to this post can be found here with the same numbering: https://github.com/nmaggiulli/of-dollars-and-data