
Leaders as Decision Architects

2015-05-05 14:45:11

John Beshears and Francesca Gino

All employees, from CEOs to frontline workers, commit preventable mistakes: We

underestimate how long it will take to finish a task, overlook or ignore

information that reveals a flaw in our planning, or fail to take advantage of

company benefits that are in our best interests. It's extraordinarily difficult

to rewire the human brain to undo the patterns that lead to such mistakes. But

there is another approach: Alter the environment in which decisions are made so

that people are more likely to make choices that lead to good outcomes.

Leaders can do this by acting as architects. Drawing on our extensive research

in the consulting, software, entertainment, health care, pharmaceutical,

manufacturing, banking, retail, and food industries and on the basic principles

of behavioral economics, we have developed an approach for structuring work to

encourage good decision making.

Our approach consists of five basic steps: (1) Understand the systematic errors

in decision making that can occur, (2) determine whether behavioral issues are

at the heart of the poor decisions in question, (3) pinpoint the specific

underlying causes, (4) redesign the decision-making context to mitigate the

negative impacts of biases and inadequate motivation, and (5) rigorously test

the solution. This process can be applied to a wide range of problems, from

high employee turnover to missed deadlines to poor strategic decisions.

Understand How Decisions Are Made

For decades, behavioral decision researchers and psychologists have suggested

that human beings have two modes of processing information and making

decisions. The first, System 1 thinking, is automatic, instinctive, and

emotional. It relies on mental shortcuts that generate intuitive answers to

problems as they arise. The second, System 2, is slow, logical, and deliberate.

(Daniel Kahneman, winner of the Nobel prize in economics, popularized this

terminology in his book Thinking, Fast and Slow.)

Each of the two modes of thinking has distinct advantages and disadvantages. In

many cases, System 1 takes in information and reaches correct conclusions

nearly effortlessly using intuition and rules of thumb. Of course, these

shortcuts can lead us astray. So we rely on our methodical System 2 thinking to

tell us when our intuition is wrong or our emotions have clouded our judgment,

and to correct poor snap judgments. All too often, though, we allow our

intuitions or emotions to go unchecked by analysis and deliberation, resulting

in poor decisions. (For a look at how both modes of thinking can cause

problems, see "Outsmart Your Own Biases.")

Overreliance on System 1 thinking has another negative effect: It leads to poor

follow-through on plans, despite people's best intentions and genuine desire to

achieve their goals. That's because System 1 tends to focus on concrete,

immediate payoffs, distracting us from the abstract, long-term consequences of

our decisions. For instance, employees know they should save for retirement,

yet they rarely get around to signing up for their 401(k) plans. (A survey

conducted in 2014 by TIAA-CREF found that Americans devote more time to

choosing a TV or the location for a birthday dinner than to setting up a

retirement account.)

We do not mean to suggest that System 1 should be entirely suppressed in order

to promote sound decisions. The intuitive reactions of System 1 serve as

important inputs in the decision-making process. For example, if an investment

opportunity triggers a fearful emotional response, the decision maker should

carefully consider whether the investment is too risky. Using System 2, the

emotional response should be weighed against other factors that may be

underappreciated by System 1, such as the long-term strategic value of the

investment.

Engaging System 2 requires exerting cognitive effort, which is a scarce

resource; there's simply not enough of it to govern all the decisions we're

called on to make. As the cognitive energy needed to exercise System 2 is

depleted, problems of bias and inadequate motivation may arise.

Define the Problem

Not every business problem should be tackled using behavioral economics tools.

So before applying them, managers should determine whether:

Human behavior is at the core of the problem.

Certain problems (employee burnout, for instance) can be resolved by changing the

way people perceive and respond to a situation. Others are fundamentally

technological in nature (for example, the lack of scientific knowledge needed to

create a new drug for treating a disease). Those problems are unlikely to be

solved by applying behavioral economics tools unless addressing them involves

changing human behavior (for example, encouraging teams of scientists to share

their discoveries in order to develop the drug).

People are acting in ways contrary to their own best interests.

Most behavioral economics tools gently guide people to different choices. They

will be most effective in situations where they encourage people to switch from

choices that are contrary to their interests to those better aligned with them.

The problem can be narrowly defined.

Sometimes all-encompassing change is required to shake up an organization. But

in many instances, complex organizational problems can be broken down into

smaller, more manageable pieces.

Consider a large U.S. retailer s efforts to rein in health care costs without

adversely impacting employees' health, which one of us (John) studied in

collaboration with James Choi, David Laibson, and Brigitte Madrian. The company

identified one piece of the problem: the high cost of the subsidies it paid for

employees' prescription drugs. Working with the drug plan administrator, the

retailer narrowed the problem further and focused on encouraging employees to

switch from picking up their prescriptions at pharmacies to having them mailed

to their homes. That shift would save both the company and employees money,

because prescriptions can be processed more cheaply at a large distribution

facility.

Behavioral economics techniques were appropriate in this case (we'll describe

later which ones the retailer used) because the problem was narrowly defined

and involved employees not acting in their own best interests: Pharmacy pickup

was less convenient than home delivery, more expensive, riskier (the error rate

in filling mail-order prescriptions is lower), and made employees more prone to

lapses in their treatment plan.

Diagnose Underlying Causes

There are two main causes of poor decision making: insufficient motivation and

cognitive biases. To determine which is causing the problematic behavior,

companies should ask two questions: First, is the problem caused by people's

failure to take any action at all? If so, the cause is a lack of motivation.

Second, are people taking action but in a way that introduces systematic errors

into the decision-making process? If so, the problem is rooted in cognitive

biases. These categories are not mutually exclusive, but recognizing the

distinction between them is a useful starting point.

Common Biases that Affect Business Decisions

Many cognitive biases impair our ability to objectively evaluate information,

form sound judgments,

and make effective decisions. These biases can be particularly problematic in

a business context.

ACTION-ORIENTED BIASES

Excessive optimism: We are overly optimistic about the outcome of planned

actions. We overestimate

the likelihood of positive events and underestimate that of negative ones.

Overconfidence: We overestimate our skill level relative to others' and

consequently our ability to affect

future outcomes. We take credit for past positive outcomes without

acknowledging the role of chance.

BIASES RELATED TO PERCEIVING AND JUDGING ALTERNATIVES

Confirmation bias: We place extra value on evidence consistent with a favored

belief and not enough

on evidence that contradicts it. We fail to search impartially for evidence.

Anchoring and insufficient adjustment: We root our decisions in an initial

value and fail

to sufficiently adjust our thinking away from that value.

Groupthink: We strive for consensus at the cost of a realistic appraisal of

alternative courses of action.

Egocentrism: We focus too narrowly on our own perspective to the point that we

can't imagine how others

will be affected by a policy or strategy. We assume that everyone has access to

the same information we do.

BIASES RELATED TO THE FRAMING OF ALTERNATIVES

Loss aversion: We feel losses more acutely than gains of the same amount, which

makes us more

risk-averse than a rational calculation would recommend.

Sunk-cost fallacy: We pay attention to historical costs that are not recoverable

when considering

future courses of action.

Escalation of commitment: We invest additional resources in an apparently

losing proposition

because of the effort, money, and time already invested.

Controllability bias: We believe we can control outcomes more than is actually

the case, causing

us to misjudge the riskiness of a course of action.

STABILITY BIASES

Status quo bias: We prefer the status quo in the absence of pressure to change

it.

Present bias: We value immediate rewards very highly and undervalue long-term

gains.

Because problems of motivation and cognition often occur when System 2 thinking

fails to kick in, the next step is to ascertain which aspect of the situation

caused System 1 to weigh the trade-offs among available options incorrectly and

what prevented System 2 from engaging and correcting the mistake. Common sense

can go a long way in diagnosing underlying causes. Put yourself in the shoes of

the person making the decision (or failing to make a decision) and ask, "What

would I do in this situation and why?"

At the retailer that wished to reduce health care costs, lack of motivation was

preventing employees from switching to home delivery for prescriptions. When

management asked them directly about the advantages and disadvantages of home

delivery, many expressed a preference for it, yet only 6% of employees who

regularly took maintenance medications (such as statins for high cholesterol)

got around to signing up for it. Simple inertia kept them from picking up the

phone, enrolling online, or mailing in a form.

Wipro BPO, a division of the business-process outsourcing firm Wipro, faced a

different kind of motivation problem. Many of its employees were burning out

and quitting after only a few months on the job. To find out why, one of us

(Francesca), together with Daniel Cable and Bradley Staats, interviewed

employees and observed their behavior. The problem lay with the division's

onboarding process, which was focused on indoctrinating new employees into the

company's culture. The training failed to build an emotional bond between new

hires and the organization and caused them to view the relationship as

transactional rather than personal. Because they were disengaged and

demotivated, the stresses of the job (dealing with frustrated customers, the

rigid scripts they had to use, and so on) got to them, causing them to leave the

company just a few months after joining.

Design the Solution

Once they've diagnosed the underlying source of a problem, companies can begin

to design a solution. In particular, managers can use choice architecture and

nudges, concepts introduced by Richard Thaler and Cass Sunstein in their 2008

book Nudge: Improving Decisions About Health, Wealth, and Happiness. The goal

of choice architecture is to improve people's decisions by carefully

structuring how information and options are presented to them. In this fashion,

companies can nudge employees in a certain direction without taking away their

freedom to make decisions for themselves.

Public-policy makers are increasingly using choice architecture tools to nudge

people toward better decisions on issues such as tax payments, medical

treatments, consumer health and wellness, and climate-change mitigation. And

businesses are starting to follow suit. For example, Google implemented choice

architecture in its cafeterias in an effort to get employees to adopt more

healthful eating habits. As Googlers reach for a plate, they encounter a sign

informing them that people who use bigger plates tend to eat more than those

who use smaller plates. Thanks to this simple change, the proportion of people

using small plates has increased by 50%.

Adjustments to the choice environment can drive big improvements at low or even

no cost. They include simply varying the order in which alternatives are

presented, altering the wording used to describe them, adjusting the process by

which they are selected, and carefully choosing defaults.

Here's a classic example: For many years, U.S. companies offered opt-in

retirement savings plans. Employees who did not actively sign up were not

enrolled. More recently, companies have been automatically enrolling their

employees. Under this opt-out system, employees have a fraction of each

paycheck (say, 6%) contributed to the plan unless they actively choose

otherwise. A collection of studies by one of us (John), with James Choi, David

Laibson, and Brigitte Madrian, found that on average only half the workers at

companies with opt-in systems join their plan by the time they've been employed

at the firm for one year. Automatic enrollment generates participation rates of

90% or higher. In changing the default, firms altered neither the menu of

options available nor the financial incentives for enrollment. They simply

changed the consequences of refraining from actively indicating one's

preferences.
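
To see the mechanics, here is a toy simulation (our own illustration, not

drawn from the studies cited, and with invented probabilities): each employee

prefers to participate but overcomes inertia and makes an active choice only

some of the time; everyone else simply inherits the default.

```python
import random

def simulate_participation(default_enrolled, n=10000, p_active=0.5,
                           p_prefers_plan=0.9, seed=0):
    # Toy model: each employee would prefer to participate with probability
    # p_prefers_plan, but overcomes inertia and makes an active choice only
    # with probability p_active; everyone else stays with the default.
    rng = random.Random(seed)
    enrolled = 0
    for _ in range(n):
        prefers_plan = rng.random() < p_prefers_plan
        acts = rng.random() < p_active
        if acts:
            enrolled += int(prefers_plan)      # active choosers follow their preference
        else:
            enrolled += int(default_enrolled)  # the rest inherit the default
    return enrolled / n

print(f"opt-in  participation: {simulate_participation(default_enrolled=False):.0%}")
print(f"opt-out participation: {simulate_participation(default_enrolled=True):.0%}")
```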

Choice architecture is more effective in improving employees' decisions than

widely used approaches such as educating individuals or offering monetary

incentives (see "When Economic Incentives Backfire," HBR, March 2009). The

reason: Those methods rely on individuals acting in their self-interest, which

people often fail to do. They also attempt to fundamentally change the way

employees process information and make decisions, which is difficult to

accomplish. The following levers can help companies take advantage of the

enormous potential of choice architecture to improve decision making.

Trigger System 1.

The emotions and biases that accompany System 1 thinking often wreak havoc, but

they can be tapped for productive purposes. Executives can trigger System 1 in

several ways:

Arouse emotions.

Let's return to the Wipro BPO example. In a bid to reduce the high turnover at

its call centers, the organization, in collaboration with one of us (Francesca),

Dan Cable, and Brad Staats, conducted an experiment aimed at strengthening

employees' emotional connection with the organization. It divided new hires

into two groups: In one, the employees were asked on the first day of

orientation to think about their strengths and how they could apply them in

their new jobs. In the control group, the employees were not given an

opportunity for self-reflection. The approach, which Wipro BPO adopted, helped

new employees to feel they could be themselves at work. The resulting emotional

bond with the organization led not only to lower employee turnover but also to

higher performance as measured by customer satisfaction. We have achieved

similar results in other organizations.

Harness biases.

Executives can also use cognitive biases to their advantage. For example,

research shows that people feel twice as bad about incurring a loss as they

feel good about receiving a gain of the same amount (a bias known as loss

aversion) and that people pay extra attention to vivid information and overlook

less flashy data (known as vividness bias). Work conducted by the Behavioral

Insights Team (BIT), an organization set up to apply nudges to improve

government services, demonstrates this. BIT collaborated with the UK Driver and

Vehicle Licensing Agency to reduce the number of people delinquent in paying

their vehicle taxes. To trigger System 1 thinking, a new notification letter

was written in plain English, along the lines of "Pay your tax or lose your

car," a departure from the complex legal language used in the original letter. To

make the demand more personal, some letters included a photo of the car in

question. The rewritten letters alone and those with the photo increased the

number of people who paid their taxes by 6% and 20%, respectively.

Organizations can also highlight the downside of failing to take action to

motivate weak performers. For instance, it's well known that having a

high-quality pipeline of new sales talent is an effective way to get

underperforming salespeople to improve their performance. This so-called "man

on the bench" effect makes vivid the possibility that they could lose their

jobs or bonuses, motivating them to work harder. Studies have found that

salespeople in districts with a bench player perform about 5% better than those

in districts without one. In the long run, the overall increase in revenue

outweighs the costs associated with hiring bench players.

Simplify the process.

Organizational processes often involve unnecessary steps that lower motivation

or increase the potential for cognitive biases. By streamlining processes,

executives can reduce such problems. At a health care center that one of us

(Francesca) worked with, the doctors had to use different IT systems across

departments to input patient information, which was then used to make decisions

about patient care. The hospital introduced a centralized system that allows a

doctor to see all of a patient's historical and personal information,

regardless of what department the patient visited in the past. As a result, the

doctors are much more motivated to keep the information up-to-date and to use

the system.

Engage System 2.

Executives have a range of options they can use to encourage greater

deliberation and analysis in decision making.

Use joint, rather than separate, evaluations.

Evaluating decision alternatives simultaneously, rather than sequentially,

reduces bias. For instance, a manager who is evaluating job candidates can

avoid making biased assessments of their likely future performance by comparing

them against one another rather than evaluating them separately. That's because

joint evaluation nudges employers to focus more on employees' past performance

and less on gender and implicit stereotypes, as research by Iris Bohnet, Alexandra

van Geen, and Max Bazerman shows. Managers often use joint evaluations in

initial hiring decisions, especially at lower levels, but they rarely take

advantage of this approach when considering employees for job assignments and

promotions. It can be helpful in many situations, such as choosing which

products to advance in the development process, evaluating investment

alternatives, and setting strategic direction.

Create opportunities for reflection.

Taking time out of our busy days to just think may sound costly, but it is an

effective way to engage System 2. Let's return to the example of the retailer

that wanted its employees to use home delivery for their medical prescriptions.

The firm told employees that in order to take advantage of their prescription

drug benefit, they had to make an active choice (by phone, web, or mail)

between home delivery and pick-up at a pharmacy. In doing so, the company

forced employees to reflect and make a decision. When the active choice program

was introduced, the percentage of employees taking long-term medications who

opted for home delivery increased more than sixfold. This generated a savings

of approximately $1 million, which was split roughly equally between employees

and the retailer.

Encouraging reflection can also help in training and employee development. One

of us (Francesca) conducted an experiment at a Bangalore call center with

colleagues Giada Di Stefano, Brad Staats, and Gary Pisano. Three groups of

employees were given the same technical training with a couple of key

differences. Workers in one group spent the last 15 minutes of certain days

reflecting (in writing) on what they'd learned. Employees in another group did

the same, and then spent an additional five minutes explaining their notes to a

fellow trainee. People in the control group just kept working at the end of the

day. In a test given after the training program, employees in the first and

second groups performed 22.8% and 25% better, respectively, than those in the

control group, despite having spent less time working. We found that reflection

had a similarly beneficial impact on employees' on-the-job performance.

Use planning prompts.

People often resolve to act in a particular way but forget or fail to follow

through. Simple prompts can help employees stick to the plan. In a study one of

us (John) conducted with Katherine Milkman, James Choi, David Laibson, and

Brigitte Madrian, we mailed letters to the employees of a midwestern utility

about the company's flu shot clinics, describing the benefits of flu shots as

well as the times and clinic locations. Some of the letters included blank

spaces for recipients to fill in with the time they would go to a clinic.

Merely prompting them to form plans by jotting down a time, even though they

were not actually scheduling an appointment, caused them to briefly engage

System 2, increasing the number of employees who got the shots by 13%.

A similar technique can be used to improve team performance. Many team efforts,

particularly those that fail to meet objectives, end with a vow to do better

next time. Unfortunately, such vague promises do nothing to prevent teams from

making the same mistakes again. A leader can help teams follow through on

resolutions by having members create clear maps for reaching their goals that

detail the when and the how.

Inspire broader thinking.

We commonly approach problems by asking ourselves, "What should I do?" Asking

"What could I do?" helps us recognize alternatives to the choice we are facing,

thus reducing bias in the evaluation of the problem and in the final decision.

But companies generally fail to broaden their perspectives in this way. In an

analysis of more than 160 decisions made by businesses over the years,

management scholar Paul Nutt found that 71% of them had been framed in terms of

whether or not an organization or a person should take a certain course of

action. That kind of framing often leads decision makers to consider only one

alternative: the course of action being discussed. A simple change in language,

using "could" rather than "should," helps us think past the black and white and

consider the shades of gray. It also allows us to consider solutions to ethical

dilemmas that move beyond selecting one option over another.

How to Use Choice Architecture to Improve Decisions

Executives can mitigate the effects of bias on decision making and motivate

employees and customers

to make choices that are in both the organization's and their own best

interests. Here's how.

1. UNDERSTAND HOW DECISIONS ARE MADE

Human beings have two modes of processing

information and making decisions:

- System 1 is automatic, instinctive, and emotional.

- System 2 is slow, logical, and deliberate.

2. DEFINE THE PROBLEM

Behavioral economics tools are most effective when:

- Human behavior is at the core of the problem.

- People are not acting in their own best interests.

- The problem can be narrowly defined.

3. DIAGNOSE THE UNDERLYING CAUSES

To determine whether poor decision making

is a result of insufficient motivation or of cognitive

biases, ask two questions:

- Is the problem caused by people's failure

to take any action at all?

- Do people take action, but in a way that

introduces systematic errors into the

decision-making process?

4. DESIGN THE SOLUTION

Use one of three levers:

- Trigger System 1 thinking by introducing

changes that arouse emotions, harness bias,

or simplify processes.

- Engage System 2 thinking by using joint

evaluations, creating opportunities for

reflection, increasing accountability, and

introducing reminders and planning prompts.

- Bypass both systems by setting defaults

and building in automatic adjustments.

5. TEST THE SOLUTION

Rigorously test the proposed solution to avoid

costly mistakes:

- Identify a target outcome that

is specific and measurable.

- Identify a range of possible solutions

and then focus on one.

- Introduce the change in some areas of the

organization (the "treatment group") and not

others (the "control group").

Increase accountability.

Holding individuals accountable for their judgments and actions increases the

likelihood that they will be vigilant about eliminating bias from their

decision making. For example, a study of federal government data on 708

private-sector companies by Alexandra Kalev and colleagues found that efforts

to reduce bias through diversity training and evaluations were the least

effective ways to increase the proportion of women in management. Establishing

clear responsibility for diversity (by creating diversity committees and staff

positions, for example) was more effective and led to increases in the number

of women in management positions.

Encourage the consideration of disconfirming evidence.

When we think that a particular course of action is correct, our tendency is to

interpret any available information as supporting that thinking. This is known

as confirmation bias. Furthermore, once we invest resources in a course of

action, we tend to justify those investments by continuing down that path, even

when new information suggests that doing so is unwise, a phenomenon known as

escalation of commitment. Together, these biases lead decision makers to

discount contradictory evidence and to ignore the possibility of superior

alternatives. Organizations can solve this problem by actively encouraging

counterfactual thinking (asking "How might events have unfolded had we taken a

different course of action?") and making sure that employees consider

disconfirming evidence. In situations where a group is making decisions, the

leader might assign one member to ask the tough questions and look for evidence

that reveals flaws in the planned course of action. (For more details on how to

do this effectively, see "Making Dumb Groups Smarter," HBR, November 2014.)

Alternatively, the leader may ask function heads to rotate their roles to get a

fresh perspective, as auditors at accounting firms, credit officers at banks,

and board members serving on committees frequently do. People who are in charge

of one domain for a long time tend to irrationally escalate their commitment to

the established way of doing things; newcomers are more likely to notice

evidence that a different course of action would be wiser. Furthermore, the

knowledge that a rotation will bring in a new set of eyes to scrutinize past

decisions encourages people to make more-disciplined choices.

Use reminders.

Reminders are an effective way to engage System 2, helping us avoid the biases

that come from relying too much on System 1. Reminders also serve to highlight

goals we want to accomplish (for instance, finishing a presentation on time),

thus increasing our motivation. One of us (Francesca) and colleagues

collaborated with an automobile insurance company to use reminders to reduce

customer dishonesty. As part of the study, the company sent 13,488 customers a

form that asked them to report how many miles they had driven that year as

indicated on their cars' odometers. The lower the reported mileage, the lower

the insurance premium, tempting customers to underreport how much they had

driven. Half the customers were asked to sign a statement at the bottom of the

form that they were being truthful. The other half were asked to sign the same

statement at the top of the form. Customers who signed at the top reported an

average of 2,400 miles more than those who signed at the bottom, which

suggested that the reason for the difference was not driving habits but the

reminder, before they filled out the form, of a goal they care about (being

honest).

Consider another example of how reminders trigger System 2 thinking. In his

book The Checklist Manifesto, surgeon and journalist Atul Gawande describes how

he introduced a surgery checklist to eight hospitals in 2008. Surgeons, nurses,

and other personnel systematically went through the checklist before performing

each surgery to remind themselves of the steps involved in the procedure. One

study that measured the checklist's effectiveness found that the new practice

resulted in 36% fewer major complications and 47% fewer deaths.

Bypass both systems.

The third approach that organizations can use to avoid biases and lack of

motivation is to create processes that automatically skirt System 1 and System

2.

Set the default.

Changing the default for standard processes (automatically enrolling employees

in a retirement plan, for instance) can have a powerful impact on ultimate

outcomes, especially when decisions are complex or difficult. At Motorola, for

example, employees who have previously worked on one product team may not join

another team working on a similar product. This rule is set as the default and

allows new teams to develop their own opinions without being affected by other

teams.

Build in automatic adjustments.

Another effective way to counter cognitive biases is to build in adjustments

that account for poor System 1 and System 2 thinking. Managers at Microsoft,

for example, figured out that programmers vastly underestimate how long it will

take them to complete tasks, a common cognitive bias called the planning

fallacy. Microsoft's solution: Add buffer time to projects. Managers examined

historical data on project delays and came up with guidelines. Timelines for

updates to applications such as Excel and Word, for example, receive a buffer

equal to 30% of the schedule. For more complex projects, such as operating

systems, timelines get a 50% buffer.
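
As a rough sketch of that adjustment (our illustration; the function and

category names are hypothetical, and the rates simply restate the percentages

above), a scheduling tool could apply the buffer mechanically:

```python
# Illustrative planning-fallacy buffer using the rates cited above:
# 30% for updates to applications, 50% for more complex projects such as
# operating systems. The function and category names are hypothetical.
BUFFER_RATES = {
    "application_update": 0.30,
    "operating_system": 0.50,
}

def buffered_estimate(engineer_estimate_days: float, project_type: str) -> float:
    """Add a fixed percentage buffer to the team's own estimate."""
    return engineer_estimate_days * (1 + BUFFER_RATES[project_type])

print(buffered_estimate(100, "application_update"))  # 130.0
print(buffered_estimate(100, "operating_system"))    # 150.0
```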

How to Choose the Right Lever

We recommend that companies first consider bypassing both systems so that the

desired outcome is implemented automatically. Because this strategy requires no

effort on the part of decision makers, it is the most powerful way to influence

results.

For many reasons, however, this approach may not be feasible or desirable. It

may be impossible or prohibitively costly to automate the process in question.

The targeted individuals may resent having the choice made for them. Or a

one-size-fits-all approach may be inappropriate.

Test Yourself: Are You Being Tricked by Intuition?

by John Beshears, Shane Frederick, and Francesca Gino

What's your default mode for judgments and decisions? To find out, take this

(very short) cognitive-reflection test, which was created by Shane Frederick at

Yale and originally appeared in The Journal of Economic Perspectives. At the

end, you'll receive feedback on your answers and gain insight into how you

arrived at them.

1. A bat and a ball cost $1.10 in total. The bat costs $1.00 more than the

ball. How much does the ball cost?

2. If it takes five machines five minutes to make five widgets, how many

minutes would it take 100 machines to make 100 widgets?

3. In a lake, there is a patch of lily pads. Every day, the patch doubles in

size. If it takes 48 days for the patch to cover the entire lake, how many days

would it take for the patch to cover half the lake?
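
For reference, the first question shows the pull of System 1 most clearly:

the intuitive answer is 10 cents, but writing the constraint out (with x as

the ball's price in dollars) gives a different result.

```latex
% Let x be the price of the ball; the bat then costs x + 1.00.
\[
\begin{aligned}
x + (x + 1.00) &= 1.10 \\
2x             &= 0.10 \\
x              &= 0.05
\end{aligned}
\]
```

So the ball costs 5 cents and the bat $1.05; the deliberate answers to the

other two questions are 5 minutes and 47 days.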

Consider the case of a bank that must decide whether to renew loans to small

businesses. It could automate the renewal decision using information from the

businesses' balance sheets and cash flows. However, the bank may make better

lending decisions if loan officers familiar with the businesses have discretion

over whether to renew loans. Even if two businesses appear identical in the

bank's computer systems, the loan officers may be aware of other factors (for

instance, changes in the management team) that make one a higher risk than the

other. Of course, giving loan officers discretion introduces biases into the

decision-making process, a potential cost that must be weighed.

If bypassing both systems is not an option, companies must choose whether to

trigger System 1 or engage System 2. The deliberative approach of System 2 can

override mistakes caused by System 1, but cognitive effort is a limited

resource. Using it for one decision means that it may not be available for

others, and this cost must be taken into account. For example, in a study of

fundraising efforts conducted at a U.S. public university by one of us

(Francesca) with Adam Grant, the performance of fundraisers improved

significantly when the director thanked them for their work. This intervention

strengthened their feelings of social worth by triggering System 1. One can

imagine interventions that would engage System 2, for instance, asking the

fundraisers to take more time to prepare for each call or increasing their

accountability for results. However, such interventions might drain their

energy and cognitive resources, diminishing their effort and persistence.

Test the Solution

The final step is to rigorously test the proposed solution to determine whether

it will accomplish its objectives. Testing can help managers avoid costly

mistakes and provide insights that lead to even better solutions. Tests should

have three key elements:

Identify the desired outcome.

The outcome should be specific and measurable. In the case of the retailer that

wanted employees to use home delivery for prescriptions, it was clear:

increasing the percentage of employees who signed up for home delivery.

Identify possible solutions and focus on one.

If you alter too many things at once, it will be difficult to determine which

piece of a complex change produced the desired effect. To avoid this problem,

the retailer rolled out its active choice prescription program without

simultaneously implementing other changes.

Introduce the change in some areas of the organization (the "treatment group")

and not others (the "control group").

If possible, divide the individuals, teams, or other entities randomly into two

groups. Randomization helps ensure that any differences in outcome between the

two groups can be attributed to the change. When such simple randomization is

not feasible for reasons of logistics, ethics, cost, or sample size,

more-sophisticated analytical techniques can be employed. (For a more detailed

explanation of how to conduct rigorous business experiments, see "The

Discipline of Business Experimentation," HBR, December 2014.)
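
A minimal sketch of that treatment/control logic, assuming a hypothetical

get_outcome hook in place of an actual rollout (the sign-up rates in the toy

usage lines are invented for illustration):

```python
import random

def run_simple_experiment(unit_ids, get_outcome, seed=42):
    # Randomly assign units (employees, stores, teams) to a treatment or a
    # control group, then compare the average outcome between the groups.
    # get_outcome(unit, treated) is a hypothetical hook standing in for
    # rolling out the change and measuring the result, e.g. 1 if the
    # employee signed up for home delivery, else 0.
    rng = random.Random(seed)
    units = list(unit_ids)
    rng.shuffle(units)                      # the randomization step
    half = len(units) // 2
    treatment, control = units[:half], units[half:]

    treated_mean = sum(get_outcome(u, True) for u in treatment) / len(treatment)
    control_mean = sum(get_outcome(u, False) for u in control) / len(control)
    return treated_mean, control_mean, treated_mean - control_mean

# Toy usage: the probabilities below are invented, not study results.
toy_rng = random.Random(7)
signup = lambda u, treated: int(toy_rng.random() < (0.40 if treated else 0.06))
print(run_simple_experiment(range(2000), signup))
```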

Insidious biases and insufficient motivation are often the main drivers behind

significant organizational problems. But it's extremely difficult to change the

way people's brains are wired. Instead, change the environment in which people

make decisions. Through some simple adjustments, executives can produce

powerful benefits for their employees and organizations.

A version of this article appeared in the May 2015 issue (pp.52-62) of Harvard

Business Review.

John Beshears is an assistant professor at Harvard Business School and a

faculty affiliate of the Harvard Kennedy School of Government's Behavioral

Insights Group. He cochairs an HBS executive education program on applying

behavioral economics to organizational problems.

Francesca Gino is a professor at Harvard Business School, a faculty affiliate

of the Behavioral Insights Group, and the author of Sidetracked: Why Our

Decisions Get Derailed, and How We Can Stick to the Plan (Harvard Business

Review Press, 2013). She cochairs an HBS executive education program on

applying behavioral economics to organizational problems. Follow her on Twitter

@francescagino.