PSYCHOLOGY
Research: Learning a Little About Something Makes Us Overconfident
by Carmen Sanchez and David Dunning
MARCH 29, 2018
As former baseball pitcher Vernon Law once put it, experience is a hard teacher because it gives the
test first, and only then provides the lesson.
Perhaps this observation can explain the results of a survey sponsored by the Association of
American Colleges & Universities. Among college students, 64% said they were well prepared to
work in a team, 66% thought they had adequate critical thinking skills, and 65% said they were
proficient in written communication. However, among employers who had recently hired college
students, less than 40% agreed with any of those statements. The students thought they were much
further along in the learning curve toward workplace success than their future employers did.
Overconfidence Among Beginners
Our research focuses on overconfidence as people tackle new challenges and learn. To be a beginner
is to be susceptible to undue optimism and confidence. Our work is devoted to exploring the exact
shape and timeline of that overconfidence.
One common theory is that beginners start off overconfident. They start a new task or job as
“unconscious incompetents,” not knowing what they don’t know. Their inevitable early mistakes
and miscues prompt them to become conscious of their shortcomings.
Our work, however, suggests the opposite. Absolute beginners can be perfectly conscious and
cautious about what they don’t know; the unconscious incompetence is instead something they
grow into. A little experience replaces their caution with a false sense of competence.
Specifically, our research focused on the common task of probabilistic learning in which people
learn to read cues from the environment to predict some outcome. For example, people must rely on
multiple signals from the environment to predict which company’s stock will rise, which applicant
will do the best job, or which illness a patient is suffering from. These can be hard tasks; even the most expert of experts will at times make the wrong prediction. Yet in many settings a decision is essential.
In a laboratory study, we asked participants to imagine they were medical residents in a post-apocalyptic world that had been overrun by zombies. (We were confident that this would be a new
scenario to all our participants, allowing them all to start as total novices.) Their job, over 60
repeated trials, was to review the symptoms of a patient, such as whether the patient had glossy
eyes, an abscess, or brain inflammation, and diagnose whether the patient was healthy or infected
with one of two zombie diseases. Participants needed to learn, by trial and error, which symptoms to
rely on to identify zombie infections. Much as in a real-world medical diagnosis of a (non-zombie)
condition, the symptoms were informative but fallible clues. There were certain symptoms that
made one diagnosis more likely, but those symptoms were not always present. Other potential
symptoms were simple red herrings. Participants diagnosed patients one at a time, receiving
feedback after every diagnosis.
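
For readers who want a concrete feel for this kind of task, the short Python sketch below simulates a probabilistic cue-learning setup in the spirit of the one described above. The symptom probabilities, the choice of which cue is a red herring, and the simple guessing rule are invented for illustration; they are not the study's actual materials or any participant's strategy.

import random

random.seed(42)

STATES = ["healthy", "zombie_A", "zombie_B"]

# Hypothetical cue validities: P(symptom present | true state). Which symptom
# is informative and which is a red herring is an invented example.
SYMPTOM_PROBS = {
    "healthy":  {"glossy_eyes": 0.2, "abscess": 0.2, "brain_inflammation": 0.5},
    "zombie_A": {"glossy_eyes": 0.8, "abscess": 0.3, "brain_inflammation": 0.5},
    "zombie_B": {"glossy_eyes": 0.3, "abscess": 0.8, "brain_inflammation": 0.5},
}

def make_patient():
    # Draw a true state, then sample fallible symptoms from that state's probabilities.
    state = random.choice(STATES)
    symptoms = {s: random.random() < p for s, p in SYMPTOM_PROBS[state].items()}
    return state, symptoms

def naive_guess(symptoms):
    # A crude rule a learner might form early on from very little evidence.
    if symptoms["glossy_eyes"] and not symptoms["abscess"]:
        return "zombie_A"
    if symptoms["abscess"] and not symptoms["glossy_eyes"]:
        return "zombie_B"
    return "healthy"

correct = 0
for trial in range(1, 61):            # 60 diagnoses, with feedback after each
    state, symptoms = make_patient()
    guess = naive_guess(symptoms)
    correct += guess == state
    if trial % 20 == 0:
        print(f"after {trial} trials: accuracy {correct / trial:.0%}")

Even this toy version captures the key property of the task: feedback arrives one noisy case at a time, so early impressions rest on very little evidence.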
The Beginner’s Bubble
We found that people slowly and gradually learned how to perform this task, though they found it
quite challenging. Their performance incrementally improved with each patient.
Confidence, however, took quite a different journey. In each study, participants started out well-
calibrated about how accurate their diagnoses would prove to be. They began thinking they were
right 50% of the time, when their actual accuracy rate was 55%. However, after just a few patients,
their confidence began skyrocketing, far ahead of any accuracy they achieved. Soon, participants
estimated their accuracy rate was 73% when it had not hit even 60%.
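
To make the gap concrete, overconfidence here can be read simply as stated confidence minus measured accuracy. The short Python snippet below does that arithmetic using the figures reported above, treating the bubble-phase accuracy as the 60% ceiling participants had not yet reached.

# Confidence-accuracy gaps computed from the figures reported above.
phases = {
    "at the start":      {"confidence": 0.50, "accuracy": 0.55},
    "beginner's bubble": {"confidence": 0.73, "accuracy": 0.60},
}
for phase, scores in phases.items():
    gap = scores["confidence"] - scores["accuracy"]
    print(f"{phase}: confidence minus accuracy = {gap:+.0%}")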
It appears that Alexander Pope was right when he
said that a little learning is a dangerous thing. In our
studies, just a little learning was enough to make
participants feel they had learned the task. After a
few tries, they were as confident in their judgments
as they were ever going to be throughout the entire
experiment. They had, as we termed it, entered into
a “beginner’s bubble” of overconfidence.
What produced this quick inflation of confidence? In
a follow-up study, we found that it arose because
participants far too exuberantly formed quick, self-
assured ideas about how to approach the medical
diagnosis task based on only the slimmest amount of
data. Small bits of data, however, are often filled
with noise and misleading signs. It usually takes a
large amount of data to strip away the chaos of the
world, to finally see the worthwhile signal. However,
classic research has shown that people do not have a
feel for this fact. They assume that every small
sequence of data represents the world just as well as long sequences do.
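
To make this point concrete, here is a small hypothetical Python sketch; the 60% figure and the sample sizes are assumptions for illustration, not numbers from our studies. When a symptom truly accompanies infection 60% of the time, estimates of that rate built from only five cases frequently land far from the truth, while estimates built from a hundred cases almost never do.

import random

random.seed(0)
TRUE_RATE = 0.60   # assumed: the symptom accompanies infection 60% of the time

def estimate(n_cases):
    # Estimate the symptom's hit rate from n observed cases.
    hits = sum(random.random() < TRUE_RATE for _ in range(n_cases))
    return hits / n_cases

small = [estimate(5) for _ in range(10_000)]
large = [estimate(100) for _ in range(10_000)]

# How often does each kind of estimate miss the true rate by more than 20 points?
off_small = sum(abs(e - TRUE_RATE) > 0.20 for e in small) / len(small)
off_large = sum(abs(e - TRUE_RATE) > 0.20 for e in large) / len(large)
print(f"5-case estimates off by more than 20 points:   {off_small:.0%}")
print(f"100-case estimates off by more than 20 points: {off_large:.0%}")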
But our studies suggested that people do eventually learn — somewhat. After participants formed
their bubble, their overconfidence often leveled off and slightly declined. People soon learned that
they had to correct their initial, frequently misguided theories, and they did. But after a correction
phase, confidence began to rise again, with accuracy never rising enough to meet it. It is important
to note that although we did not predict the second peak in confidence, it consistently appeared
throughout all of our studies.
A Real-World Bubble
The real world follows this pattern. Other research
has found that doctors learning to do spinal surgery usually do not begin to make mistakes until around their 15th surgery. Similarly, beginning pilots produce few accidents at first, but their accident rate then rises until it peaks at about 800 flight hours, after which it begins to drop again.
We also found signs of the beginner’s bubble outside
of the laboratory. Consider personal finance: most people under the age of 18 know little about it, and most primary and secondary educational systems do not teach financial literacy. As a result, much like the probabilistic learning task in our studies, personal finance is something most people learn by trial and error.
We found echoes of our laboratory results across the
life span in surveys on financial capability
conducted by the Financial Industry Regulatory
Authority. Each survey comprised a nationally
representative sample of 25,000 respondents who
took a brief financial literacy test and reported how knowledgeable about personal finance they
believed they were. Much like in the laboratory, both surveys showed that real financial literacy rose slowly, incrementally, and uniformly across age groups.
Self-confidence, however, surged between late adolescence and young adulthood, then leveled off
among older respondents until late adulthood, when it began to rise again — a result perfectly
consistent with our laboratory pattern.
It is important to note that our work has several limitations. In our experiments, participants
received perfect feedback after each trial. In life, consistent feedback like this is often unavailable.
Also, our tasks traced how confidence changed as people learned truly novel tasks. There are plenty
of tasks people learn in which they can apply previous knowledge to the new task. We do not know
how confidence would change in these situations. Relatedly, we cannot be certain what would
happen to overconfidence after the 60th trial.
With that said, our studies suggest that the work of a beginner might be doubly hard. Of course, the
beginner must struggle to learn, but the beginner must also guard against the illusion of having mastered the task too quickly. Perhaps Alexander Pope suggested the best remedy for this beginner's bubble when he observed that if a few shallow draughts of experience intoxicate the brain, the only cure is to keep drinking until we are sober again.
Carmen Sanchez is a PhD candidate in Social and Personality Psychology at Cornell University. She studies how perceptions of abilities change as people learn, cultural differences in self-enhancement, and financial decision-making.
David Dunning is a Professor of Psychology at the University of Michigan. His research focuses on the psychology of human misbelief, particularly false beliefs people hold about themselves.