At first, we thought our experiment was a humorous and powerful illustration of the fact that when we pay attention to one thing, we can miss other obvious things that happen right before our eyes. But when we reflected on how much this result surprised the participants in our experiment, and most people who hear about the experiment, we realized that it also illustrates a profound gap between how our minds really work and how we think they work. We call this gap the “illusion of attention.” We think we are paying attention to and noticing everything important in our world, but we must really be missing a whole lot. All of us may have invisible gorillas hiding in plain sight at home and at work.
The disconnect between how our minds work and how we believe they work goes far beyond the domain of vision and attention. For example, intuition and experience tell us that having more information enables us to make better decisions. In this article, we reveal four ways that more can actually be less: four ways in which having more information can lead to worse decisions.
1. Beware of confidence. If you know what someone else thinks, you have one piece of information. If you also know how confident they are in their opinion, you have two pieces of information. But confidence is more like a personality trait than a valid signal of accuracy, knowledge, or skill; confident people tend to be confident in general, regardless of whether they are right.
If you assign too much weight to the opinions of confident people, you can be led astray, and unfortunately that is exactly what we tend to do. Experiments by psychologist Gideon Keren show that even when we can directly compare two experts, one accurate and one overconfident, we prefer the overconfident one. Weather forecasting is one of the few domains where confidence and accuracy can be disentangled: because forecasters' predictions are tested every single day, it is possible to compile a complete record of both.
In most other fields, compiling such a record is impossible or impractical, so we have even less ability to separate confidence from accuracy when deciding whom to trust. Unfortunately, confidence (possibly unjustified) also makes it easier to land a job as a TV stock guru, business magazine advice columnist, or management consultant.
2. Don't model the noise. We tend to form simple theories to make sense of complex data. Unfortunately, overly simple theories often wind up modeling the noise rather than the signal. In an experiment by behavioral economist Richard Thaler, investors in simulated mutual funds made much less money when they got monthly performance data than when they got information at one-year or five-year intervals.
The subjects who got more information learned only about short-term volatility, not long-term trends, so the knowledge they extracted was not useful for long-term investing—they based their decisions on the noise rather than the signal. The behavior of the investors who got monthly feedback has a lot in common with that of the people who see Mother Teresa's image in a cinnamon roll or the Virgin Mary on a burnt piece of toast (and then sell these icons on eBay). There is a lot of randomness in the world, and thus many opportunities to form false ideas about it.
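A minimal simulation makes the point concrete. This is our own illustrative sketch, not Thaler's actual experiment, and the numbers (8 percent annual drift, 20 percent annual volatility) are assumptions chosen only for illustration: an asset with a perfectly healthy long-run trend shows a loss in nearly half of its monthly check-ins, but in only a small fraction of its five-year ones.

```python
import random

# Illustrative sketch (not Thaler's experiment): a risky asset with a positive
# long-run trend looks discouraging when checked monthly, because short-horizon
# returns are dominated by volatility rather than by the trend.
# The 8% annual drift and 20% annual volatility are assumed for illustration.

random.seed(42)
ANNUAL_DRIFT = 0.08
ANNUAL_VOL = 0.20
N_MONTHS = 12 * 50  # 50 years of simulated monthly returns

monthly = [random.gauss(ANNUAL_DRIFT / 12, ANNUAL_VOL / 12 ** 0.5)
           for _ in range(N_MONTHS)]

def fraction_negative(returns, horizon):
    """Fraction of non-overlapping windows of `horizon` months showing a loss."""
    windows = [sum(returns[i:i + horizon])
               for i in range(0, len(returns) - horizon + 1, horizon)]
    return sum(r < 0 for r in windows) / len(windows)

for months, label in [(1, "monthly"), (12, "yearly"), (60, "five-year")]:
    print(f"{label:>9} check-ins: {fraction_negative(monthly, months):.0%} show a loss")
```

An investor who checks monthly sees a loss almost half the time and may flee the asset; one who checks every five years almost always sees the trend. The underlying investment is the same in both cases; only the frequency of feedback differs.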
3. Don't assume the first thing causes the second. Information about the temporal order of events is enormously useful, but it can also mislead us. We intuitively assume that when two seemingly related things happen in sequence, the first must have caused the second.
A common illustration of this error is the conclusion that performance improvements following the hiring of a new CEO must have been caused by the CEO's actions. Billions of dollars in executive compensation are doled out on this basis every year. But other factors that happened to coincide with the new leader's arrival could easily have been responsible, in whole or in part, for the improvements.
The same principle applies whenever we attribute a change in results to an action taken beforehand, as when sales increase after a new advertising campaign. Again, there are many other possible causes that we rarely stop to consider, such as changes in the array of competing products or shifts in demand driven by some external factor. Unfortunately, when temporal order is the only information we have, there is no way to know what really caused what.
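A small simulation shows how easily temporal order can mimic causation. This is a hedged sketch with made-up numbers, not data from any real firm: quarterly sales fluctuate randomly around a fixed mean, so the "campaign" has no effect whatsoever. Yet because interventions tend to be launched right after bad results, simple reversion toward the average makes most firms appear to improve afterward.

```python
import random

# Hedged sketch (illustrative numbers, no real data): sales are drawn from the
# same distribution throughout, so the "ad campaign" has zero causal effect.
# We ask how often sales nonetheless look better after the campaign than before.

random.seed(0)

def apparent_improvement(launch_after_bad_quarter, n_firms=10_000, quarters=8):
    """Fraction of firms whose mean sales rise after a no-op campaign."""
    improved = 0
    for _ in range(n_firms):
        sales = [random.gauss(100, 10) for _ in range(quarters)]
        if launch_after_bad_quarter:
            # Campaign launched right after the worst early quarter, as
            # interventions in the real world often follow bad results.
            cut = sales.index(min(sales[: quarters // 2])) + 1
        else:
            cut = quarters // 2  # campaign at a fixed, arbitrary midpoint
        before = sum(sales[:cut]) / cut
        after = sum(sales[cut:]) / (quarters - cut)
        improved += after > before
    return improved / n_firms

print(f"campaign at arbitrary midpoint: {apparent_improvement(False):.0%} 'improve'")
print(f"campaign after a bad quarter:   {apparent_improvement(True):.0%} 'improve'")
```

With an arbitrary campaign date, about half the firms "improve" by chance alone; launching after a bad quarter pushes that well above half, even though the campaign does nothing at all.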
4. Remember that the information you look for isn't special. We tend to value information that we uncover ourselves more than information provided by others, even when the information itself is identical. In one study, physician Donald Redelmeier randomly assigned doctors to one of two groups. Each participant was told to imagine being the only doctor on a flight during which a passenger developed chest pain suggestive of a heart attack. The first group was given two pieces of information about the patient: heart rate and blood pressure. The second group was given the heart rate and asked if they also wanted to know the blood pressure.
The pressure data (systolic pressure of 120) suggested that the patient was not in grave danger. Eleven percent of the doctors in the first group recommended continuing the flight (as opposed to landing so the patient could be taken to a hospital). Of the doctors in the second group who requested the blood pressure, 85 percent then recommended continuing the flight, nearly eight times the rate in the first group, even though they had identical information about the patient's condition.
The only difference was that the second group had chosen to get the blood pressure data rather than having it handed to them. Think about how often you seek out “more information” before you make an important decision. Now think about how this information may be having a disproportionate effect on the choices you make, and try to make sure you weigh evidence judiciously regardless of your own role in collecting it.
Christopher Chabris and Daniel Simons are psychology professors and the authors of The Invisible Gorilla: And Other Ways Our Intuitions Deceive Us (Crown/Random House).