By Jonah Lehrer
June 12, 2012
Editor's Note: The introductory paragraphs of this post appeared in similar
form in an October, 2011, column by Jonah Lehrer for the Wall Street Journal.
We regret the duplication of material.
Here's a simple arithmetic question: A bat and ball cost a dollar and ten
cents. The bat costs a dollar more than the ball. How much does the ball cost?
The vast majority of people respond quickly and confidently, insisting the ball
costs ten cents. This answer is both obvious and wrong. (The correct answer is
five cents for the ball and a dollar and five cents for the bat.)
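To make the arithmetic explicit: if the ball costs b, the bat costs b plus one dollar, so b + (b + 1.00) = 1.10, which gives 2b = 0.10 and b = 0.05. Here is a minimal Python sketch (mine, not the article's; the variable names are illustrative) that checks the answer:

```
# Bat-and-ball: bat + ball = $1.10 and bat - ball = $1.00.
# Working in cents avoids floating-point rounding.
total_cents = 110       # bat + ball
difference_cents = 100  # bat - ball
ball = (total_cents - difference_cents) // 2  # since 2 * ball = total - difference
bat = ball + difference_cents
print(f"ball = {ball} cents, bat = {bat} cents")  # ball = 5 cents, bat = 105 cents
assert ball + bat == total_cents and bat - ball == difference_cents
```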
For more than five decades, Daniel Kahneman, a Nobel Laureate and professor of
psychology at Princeton, has been asking questions like this and analyzing our
answers. His disarmingly simple experiments have profoundly changed the way we
think about thinking. While philosophers, economists, and social scientists had
assumed for centuries that human beings are rational agents (reason was our
Promethean gift), Kahneman, the late Amos Tversky, and others, including Shane
Frederick (who developed the bat-and-ball question), demonstrated that we're
not nearly as rational as we like to believe.
When people face an uncertain situation, they don't carefully evaluate the
information or look up relevant statistics. Instead, their decisions depend on
a long list of mental shortcuts, which often lead them to make foolish
decisions. These shortcuts aren't a faster way of doing the math; they're a way
of skipping the math altogether. Asked about the bat and the ball, we forget
our arithmetic lessons and instead default to the answer that requires the
least mental effort.
Although Kahneman is now widely recognized as one of the most influential
psychologists of the twentieth century, his work was dismissed for years.
Kahneman recounts how one eminent American philosopher, after hearing about his
research, quickly turned away, saying, "I am not interested in the psychology
of stupidity."
The philosopher, it turns out, got it backward. A new study in the Journal of
Personality and Social Psychology led by Richard West at James Madison
University and Keith Stanovich at the University of Toronto suggests that, in
many instances, smarter people are more vulnerable to these thinking errors.
Although we assume that intelligence is a buffer against bias (that's why those
with higher S.A.T. scores think they are less prone to these universal thinking
mistakes), it can actually be a subtle curse.
West and his colleagues began by giving four hundred and eighty-two
undergraduates a questionnaire featuring a variety of classic bias problems.
Here's an example:
In a lake, there is a patch of lily pads. Every day, the patch doubles in size.
If it takes 48 days for the patch to cover the entire lake, how long would it
take for the patch to cover half of the lake?
Your first response is probably to take a shortcut and cut the forty-eight days
in half. That leads you to twenty-four days. But that's wrong. The correct
solution is forty-seven days.
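The logic, spelled out: because the patch doubles every day, it must have covered half the lake exactly one day before it covered all of it. A short Python sketch (mine, not part of the original article) that confirms this by running the doubling backward:

```
# Lily pads: the patch doubles daily and covers the whole lake on day 48.
full_day = 48
coverage = 1.0  # fraction of the lake covered on day 48
day = full_day
while coverage > 0.5:
    coverage /= 2.0  # one day earlier, the patch was half as big
    day -= 1
print(f"the lake was half covered on day {day}")  # day 47
```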
West also gave a puzzle that measured subjects' vulnerability to something
called anchoring bias, which Kahneman and Tversky had demonstrated in the
nineteen-seventies. Subjects were first asked if the tallest redwood tree in
the world was more than X feet, with X ranging from eighty-five to a thousand
feet. Then the students were asked to estimate the height of the tallest
redwood tree in the world. Students exposed to a small anchor, like
eighty-five feet, guessed, on average, that the tallest tree in the world was
only a hundred and eighteen feet. Given an anchor of a thousand feet, their
estimates increased seven-fold.
But West and colleagues weren't simply interested in reconfirming the known
biases of the human mind. Rather, they wanted to understand how these biases
correlated with human intelligence. As a result, they interspersed their tests
of bias with various cognitive measurements, including the S.A.T. and the Need
for Cognition Scale, which measures "the tendency for an individual to engage
in and enjoy thinking."
The results were quite disturbing. For one thing, self-awareness was not
particularly useful: as the scientists note, people who were aware of their
own biases were not better able to overcome them. This finding wouldn't
surprise Kahneman, who admits in "Thinking, Fast and Slow" that his decades of
groundbreaking research have failed to significantly improve his own mental
performance. "My intuitive thinking is just as prone to overconfidence, extreme
predictions, and the planning fallacy (a tendency to underestimate how long it
will take to complete a task) as it was before I made a study of these issues,"
he writes.
Perhaps our most dangerous bias is that we naturally assume that everyone else
is more susceptible to thinking errors, a tendency known as the "bias blind
spot." This meta-bias is rooted in our ability to spot systematic mistakes in
the decisions of others (we excel at noticing the flaws of friends) and our
inability to spot those same mistakes in ourselves. Although the bias blind
spot itself isn't a new concept, West's latest paper demonstrates that it
applies to every single bias under consideration, from anchoring to so-called
framing effects. In each instance, we readily forgive our own minds but look
harshly upon the minds of other people.
And here's the upsetting punch line: intelligence seems to make things worse.
The scientists gave the students four measures of cognitive sophistication.
As they report in the paper, all four of the measures showed positive
correlations, "indicating that more cognitively sophisticated participants
showed larger bias blind spots." This trend held for many of the specific
biases, indicating that smarter people (at least as measured by S.A.T. scores)
and those more likely to engage in deliberation were slightly more vulnerable
to common mental mistakes. Education also isn't a savior; as Kahneman and Shane
Frederick first noted many years ago, more than fifty per cent of students at
Harvard, Princeton, and M.I.T. gave the incorrect answer to the bat-and-ball
question.
What explains this result? One provocative hypothesis is that the bias blind
spot arises because of a mismatch between how we evaluate others and how we
evaluate ourselves. When considering the irrational choices of a stranger, for
instance, we are forced to rely on behavioral information; we see their biases
from the outside, which allows us to glimpse their systematic thinking errors.
However, when assessing our own bad choices, we tend to engage in elaborate
introspection. We scrutinize our motivations and search for relevant reasons;
we lament our mistakes to therapists and ruminate on the beliefs that led us
astray.
The problem with this introspective approach is that the driving forces behind
biases (the root causes of our irrationality) are largely unconscious, which
means they remain invisible to self-analysis and impermeable to intelligence.
In fact, introspection can actually compound the error, blinding us to those
primal processes responsible for many of our everyday failings. We spin
eloquent stories, but these stories miss the point. The more we attempt to know
ourselves, the less we actually understand.
Drawing by James Stevenson.
Note: This article has been modified to include mention of Shane Frederick.
https://www.newyorker.com/tech/frontal-cortex/why-smart-people-are-stupid