The best way to win an argument

2014-05-26 13:21:27

Tom Stafford

How do you change someone's mind if you think you are right and they are wrong? Psychology reveals that the last thing to do is the tactic we usually resort to.

You are, I'm afraid to say, mistaken. The position you are taking makes no logical sense. Just listen up and I'll be more than happy to elaborate on the many, many reasons why I'm right and you are wrong. Are you feeling ready to be convinced?

Whether the subject is climate change, the Middle East or forthcoming holiday plans, this is the approach many of us adopt when we try to convince others to change their minds. It's also an approach that, more often than not, leads to the person on the receiving end hardening their existing position. Fortunately, research suggests there is a better way: one that involves more listening, and less trying to bludgeon your opponent into submission.

A little over a decade ago, Leonid Rozenblit and Frank Keil from Yale University suggested that in many instances people believe they understand how something works when in fact their understanding is superficial at best. They called this phenomenon "the illusion of explanatory depth". They began by asking their study participants to rate how well they understood how things like flushing toilets, car speedometers and sewing machines worked, before asking them to explain what they understood and then answer questions on it. On average, people rated their understanding as much worse after it had been put to the test.

What happens, argued the researchers, is that we mistake our familiarity with these things for the belief that we have a detailed understanding of how they work. Usually nobody tests us, and if we have any questions about them we can just take a look. Psychologists have a name for this idea that humans tend to take mental shortcuts when making decisions or assessments: the "cognitive miser" theory.

Why would we bother expending the effort to really understand things when we can get by without doing so? The interesting thing is that we manage to hide from ourselves exactly how shallow our understanding is.

It's a phenomenon that will be familiar to anyone who has ever had to teach something. Usually, it only takes the first moments when you start to rehearse what you'll say to explain a topic, or worse, the first student question, for you to realise that you don't truly understand it. All over the world, teachers say to each other "I didn't really understand this until I had to teach it". Or as researcher and inventor Mark Changizi quipped: "I find that no matter how badly I teach I still learn something".

Explain yourself

Research published last year on this illusion of understanding shows how the effect might be used to convince others they are wrong. The research team, led by Philip Fernbach of the University of Colorado, reasoned that the phenomenon might hold as much for political understanding as for things like how toilets work. Perhaps, they figured, people with strong political opinions would be more open to other viewpoints if asked to explain exactly how the policy they were advocating would bring about the effects they claimed it would.

Recruiting a sample of Americans via the internet, they polled participants on a set of contentious US policy issues, such as imposing sanctions on Iran, healthcare and approaches to carbon emissions. One group was asked to give their opinion and then provide reasons why they held that view. This group got the opportunity to put their side of the issue, in the same way anyone in an argument or debate has a chance to argue their case.

Those in the second group did something subtly different. Rather than provide reasons, they were asked to explain how the policy they were advocating would work. They were asked to trace, step by step, from start to finish, the causal path from the policy to the effects it was supposed to have.

The results were clear. People who provided reasons remained as convinced of their positions as they had been before the experiment. Those who were asked to provide explanations softened their views, and reported a correspondingly larger drop in how they rated their understanding of the issues. People who had previously been strongly for or against carbon emissions trading, for example, tended to become more moderate, rating themselves as less certain in their support for or opposition to the policy.

So this is something worth bearing in mind next time you're trying to convince a friend that we should build more nuclear power stations, that the collapse of capitalism is inevitable, or that dinosaurs co-existed with humans 10,000 years ago. Just remember, however, there's a chance you might need to be able to explain precisely why you think you are correct. Otherwise you might end up being the one who changes their mind.