Researchers Explore Why We Think We're Right When We're Wrong
The result is a new bias named the "illusion of information adequacy"
Hiya!
Have you ever talked with someone who insists they’re right, even when they aren’t? Or gotten a bit of road rage when you’re in a hurry and the car in front of you seems to stop for no reason other than to make you late, so you honk, only to discover the car was waiting for a pedestrian you hadn’t noticed? What is it that makes us so confidently wrong?
That’s precisely what a team of researchers wanted to find out. So, they conducted a study to better understand how people judge other people and situations based on their confidence in the information they have, even when what they know isn’t the whole picture. The result adds another bias we should all be mindful of — the illusion of information adequacy.
The Study
The new study, published in the open-access journal PLOS ONE on October 9, 2024, is a collaboration led by Hunter Gehlbach, an educational psychologist at Johns Hopkins University’s School of Education. Gehlbach, along with Carly Robinson, a senior researcher at Stanford University’s Graduate School of Education, and Angus Fletcher, a professor of English at Ohio State University, designed an experiment to measure how confident people are in the decisions they make based on the information they have.
The researchers recruited 1,261 Americans, the majority of whom were White (71 percent) and male (59 percent), with an average age of about 40.
All participants read a fictitious story titled “Our School Water is Disappearing.” The story involves a school running out of water because its local aquifer is drying up. The school must decide whether to stay put and hope for more rain in the future or merge with another school, which entails other risks.
After reading the article, the participants provided their opinions about which decision the school should make — but of course, there’s a twist. The researchers had three versions of the fake article.
One group of 503 participants read a version of the article that offered three arguments in favor of the schools merging, plus one neutral point. In contrast, 506 other participants read a version that included three arguments against the merger, along with one neutral point.
The last group, 252 participants who served as the control, read a version of the article that provided both sides' perspectives, including all seven arguments (three pro-merge, three pro-separate, and one neutral).
After the participants read their articles, the researchers asked what they thought the school should do and how confident they were that they had all the information they needed to make their judgment.
The Results
The researchers discovered that people who read the biased articles were far more likely to agree with the stance their version of the article argued for. They were also highly confident that the article provided all the information they needed to reach their decision. Further, both groups were confident that most people would share their opinions.
Meanwhile, the control group, who read arguments for and against the schools merging, were less confident in their opinions about which choice the school should make. In a press release by Ohio State University, Fletcher, who co-wrote the study, explained:
“We found that, in general, people don’t stop to think whether there might be more information that would help them make a more informed decision. If you give people a few pieces of information that seems to line up, most will say ‘that sounds about right’ and go with that.”
People's agreement with the arguments they read, either for or against the school merger, aligned with the researchers' hypotheses, but the researchers were surprised by the next stage of the experiment.
After the participants provided their initial opinions, the researchers provided them with information from the opposing side that contradicted the first article they read. The researchers write in the study:
“We anticipated that the original information these treatment groups received — despite its partial nature — would help participants form opinions that would be hard to reverse, even in the face of learning compelling information to the contrary. Our data did not support this hypothesis.”
When the participants were presented with all the facts after reading both articles, they were often willing to change their minds and, like the control group, became less confident in their original judgment.
That said, the researchers note that these results likely don’t apply to situations where people hold pre-established beliefs, such as politics. Fletcher explained to Kaitlin Sullivan of NBC News:
“People are more open-minded and willing to change their opinions than we assume,” he said. However, “this same flexibility doesn’t apply to long-held differences, such as political beliefs.”
Fletcher adds in the press release:
“[M]ost interpersonal conflicts aren’t about ideology. They are just misunderstandings in the course of daily life.”
It’s promising that people are willing to change their minds about situations, at least when those situations don’t threaten their core beliefs. Still, it’s concerning, and potentially dangerous, how easy it is to think we know more than we do. So, why does the illusion of information adequacy happen?
Why
The researchers say their findings complement other research on a bias known as naïve realism. This bias occurs when we believe that our subjective experiences mirror objective reality or that our understanding of a situation is the objective truth.
While naïve realism focuses on how people come to different understandings of the same situation, the illusion of information adequacy suggests that when people are given adequate information, they may converge on a shared understanding of it.
Fletcher points out in the press release that the study suggests the illusion of information adequacy stems from a sort of default mode in which we assume we know all the relevant facts, even when we don’t. The researchers touch on this issue in the paper:
"A major source of misunderstanding and conflict in our daily lives arises from this paradox: We know that, in theory, there are plenty of things that we don't know we don't know. Yet, in practice, we almost always behave as though we have adequate information to voice our opinions, make good decisions, and pass judgment on others."
But why does this default mode exist? Shouldn’t we want to gather as much information as possible before making a decision? Well, as with so many of our other biases, the likely explanation traces back to our innate survival instincts.
After all, sometimes we don’t have the luxury of gathering information, especially when our lives are on the line. In those moments, we have to make quick, confident decisions based on little information.
Of course, our lives aren’t typically at risk these days, but our brains have retained many of our instinctual behaviors and biases. While jumping to conclusions may have worked out well in the past and sometimes still does, we need to be more aware of these tendencies and think before we act. So, how can we protect ourselves from the illusion of information adequacy? Fletcher has a simple answer:
“Your first move when you disagree with someone should be to think, ‘Is there something that I’m missing that would help me see their perspective and understand their position better?’ That’s the way to fight this illusion of information adequacy.”
Perspective Shift
We have never had access to as much information as we do now, yet that same access can create the illusion that we know more than we do and lead us into echo chambers. Critical thinking is more important now than ever.
It seems to me that as technology advances, and our reliance on it grows, our critical thinking skills have declined. Similarly, our laser focus on exploring and understanding our shared external world reduces our attention to the subjective world that’s unique to each of us. The result is a lack of understanding of our emotions and a tendency to react rather than respond to a situation or disagreement.
Along the same lines, the illusion of information adequacy makes us confident in our wrongness. Now that we know about this default instinct, we can pay better attention to how we react to new information. When reading an article, we should pause to consider whether it’s one-sided, or at least read multiple articles on the same topic to gain more information.
You’re reading my free newsletter, Curious Adventure. If you want more, consider subscribing to Curious Life — of which you receive sneak peeks every Monday morning. That newsletter covers a diverse range of topics and digs deeper into this Curious Adventure we call Life.
These articles require several hours, sometimes days, of research, writing, and editing before publication. The subscription fee helps me pay my bills so I can continue providing high-quality content and doing what I love — following my curiosities and sharing what I learn with you.
If you enjoy my work and want to show your support, you can donate to my PayPal. Thank you for reading. I appreciate you.
I wonder whether the results of the study would have changed much (or at all) if the group had been composed primarily of women rather than men.