Why rational decisions are so difficult and so rare

Most thinking goes on below the level of the conscious, reasoning mind.   It couldn’t be otherwise.   Human beings couldn’t function if they had to think out the reasons for every action.

The philosopher John Dewey said human actions are determined by impulse, habit and reason.  Our habits control our impulses.   It is only when neither our impulses nor our established habits get us what we want that we start reasoning.  This is how things are.

An experimental psychologist named Daniel Kahneman has devoted his life to studying how this works.   In his book, Thinking, Fast and Slow (2011), he summarized what he and other psychologists have discovered about the interplay of intuition and reason in decision-making.

What’s noteworthy about the book is that it is based on real science.  Every assertion in it is backed up by a study, many of them by Kahneman himself and his friend,  the late Amos Tversky.

Our default mode of thinking is what Kahneman calls “fast thinking,” or System 1. It consists of the mental processes that enabled our prehistoric ancestors to react quickly and survive.

“Slow thinking”, or System 2, is the override system, comparable to taking conscious control of your breathing.   It requires continuous concentration and effort.  Doing it is hard work.  Some are better at it than  others, but few people can sustain it for long.

System 1 consists of pattern recognition.  The human mind is constantly monitoring the present state of things and matching it with previous experiences and impressions.

This works well for people with long experience of doing similar things, and receiving immediate feedback.    If a firefighter in a burning building or an anesthesiologist in an operating room says something doesn’t seem right, you’d better heed them, because their intuition is grounded in long experience of burning buildings and operating rooms.  Over time, chess players, performing artists and emergency room nurses develop reliable intuition.

The problem is that intuition will give you an answer whether there is any basis for it or not.   Political pundits, stock market analysts and clinical psychologists typically have poor records of predicting results, but this seldom affects their self-confidence.

Human beings would be paralyzed if we had to think of logical reasons for every decision and exercise conscious control over every action.   We need intuition.  But intuition can mislead us.  Kahneman’s book is about ways this happens.

Thinking, Fast and Slow is an extremely rich book.  Almost every chapter could be expanded into a self-help book, while some could be textbooks on negotiations, advertising and propaganda.

I’ve had a hard time getting started on writing about the book, maybe just because there is so much in it.   I’ve given up on trying to give an overview.  I will just hit a few highlights in the hope that I can spark interest in reading it.

One problem with intuitive thinking is what Kahneman calls the planning fallacy. Those who plan projects typically try to factor in everything they can foresee going wrong. Predictably, they can’t foresee everything that can go wrong. That’s why home remodeling contractors and military suppliers make most of their money on change orders.

Kahneman, who grew up in Israel, once talked the Israeli Ministry of Education into commissioning a high school textbook on judgment and decision-making.  He assembled a team, did some preliminary work, and then questioned Seymour, his curriculum expert.

Question: What was the failure rate of teams that wrote textbooks from scratch? Answer: About 40 percent. Question: How long did it take the teams that finished? Answer: Six to ten years. Question: Are we better than the other teams? Answer: No, but we’re not that bad.

Nevertheless, Kahneman let the team go ahead. The textbook took about eight years to complete, and by then the Israeli government had lost interest.

The lesson is that, if you are planning a project, you should look at the track record of those who have attempted similar projects. Treat that reference group’s base rate as your starting estimate, and only then adjust for whatever makes your project genuinely different from the others.
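
To make the procedure concrete, here is a minimal sketch in Python. The project durations, the use of the median as the base rate, and the adjustment factor are all invented for illustration, not anything from the book; the only idea taken from Kahneman is starting from the reference class rather than from your own plan.

```python
# A minimal sketch of the "outside view": start from the record of
# similar past projects, adjust only modestly for your own case.
# All numbers here are hypothetical, for illustration only.

def outside_view_estimate(reference_durations, adjustment=1.0):
    """Take the median duration of the reference class as the baseline,
    then apply a (hopefully small) adjustment for genuine differences."""
    baseline = sorted(reference_durations)[len(reference_durations) // 2]
    return baseline * adjustment

# Durations, in years, of comparable textbook projects (hypothetical).
past_projects = [6, 7, 7, 8, 9, 10]

print(outside_view_estimate(past_projects))       # 8.0 -> the base rate
print(outside_view_estimate(past_projects, 0.9))  # 7.2 -> even a generous
                                                  # adjustment stays far from
                                                  # an optimistic inside view
```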

Most entrepreneurs don’t do this, Kahneman said.  This is probably good for society, because the public benefits from their effort, while the entrepreneurs and their backers absorb the loss.   But if you’re an entrepreneur yourself, you’re better off looking before you leap.

Overconfidence is based on what Kahneman calls the WYSIATI (what you see is all there is) syndrome.   As a young Israeli military officer, Kahneman once had the task of setting up an interview system to determine the fitness of recruits for different branches of military service.  Interviews were then conducted by women soldiers who made recommendations based on their personal judgment.  The results were little better than random.

Kahneman drew up a questionnaire based on factual questions relevant to such characteristics as responsibility, sociability and masculine pride: number of jobs held, work history, participation in sports and so on. He asked the interviewers to rate each recruit on these individual factors, without regard to overall fitness, and then combined the ratings into a weighted score.

When the interviewers balked at being reduced to scorekeepers, he added a final step: shut your eyes, imagine how well the recruit would do in the military, and give a rating. The scores based on the questionnaire were imperfect, but better than the previous subjective evaluations. Surprisingly, so was the “shut your eyes” judgment, perhaps because the intuition now drew on actual information.

In general, Kahneman said, decisions based on checklists and algorithms have a better track record than decisions based on personal judgment.   Simple checklists and algorithms work better than complex formulas.
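
For the curious, here is what the simplest version of such a scoring rule looks like in code. The factor names, weights and rating scale are hypothetical, chosen only to echo the interview story above; they are not Kahneman’s actual instrument.

```python
# A minimal sketch of a weighted checklist score: rate each factual
# factor separately, then combine with fixed weights. The factors and
# weights below are hypothetical, not Kahneman's actual questionnaire.

FACTORS = {
    "responsibility": 0.4,
    "sociability":    0.3,
    "work_history":   0.3,
}

def weighted_score(ratings):
    """Combine per-factor ratings (each on a 1-5 scale) into one score.
    No holistic judgment enters until every factor is rated on its own."""
    return sum(FACTORS[name] * rating for name, rating in ratings.items())

recruit = {"responsibility": 4, "sociability": 2, "work_history": 5}
print(weighted_score(recruit))  # 3.7
```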

(I have to say I have reservations about checklists and algorithms.  They are only as good as the people drawing them up.   The “garbage in, garbage out” principle applies, as does Goodhart’s Law.)

Most human beings are loss averse. They’d rather have the certainty of a small gain than an uncertain chance at a big one. But faced with a certain loss, they’d risk an even bigger loss to avoid it.

The average person, according to Kahneman, would only risk a loss of $100 if he had an equal chance of winning $200.
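
That ratio implies that losses weigh roughly twice as much as gains. Here is a minimal sketch of the arithmetic, assuming the simplest possible linear value function with a loss-aversion coefficient of 2; Kahneman and Tversky’s actual prospect-theory value function is also curved, which this deliberately ignores.

```python
# Loss aversion in miniature: losses count about twice as much as gains.
# The linear value function with coefficient 2 is an assumption read off
# the $100-loss / $200-gain break-even point, not the book's full model.

LAMBDA = 2.0  # how much more a loss hurts than an equal gain pleases

def felt_value(amount):
    return amount if amount >= 0 else LAMBDA * amount

def coin_flip_value(gain, loss):
    """Psychological value of a 50/50 bet: win `gain` or lose `loss`."""
    return 0.5 * felt_value(gain) + 0.5 * felt_value(-loss)

print(coin_flip_value(200, 100))  # 0.0  -> just barely acceptable
print(coin_flip_value(150, 100))  # -25.0 -> refused, despite a positive
                                  #          expected dollar value of +25
```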

This makes people vulnerable to framing effects. You can get people to make different decisions based on the same facts, depending on whether you frame the decision as seeking a gain or as risking a loss.

For example, test subjects were asked what they’d do if they were given $1,000, then offered a choice between a sure gain of an added $500 and a 50/50 chance of getting either $1,000 more or nothing.  Most chose the $500.

Other test subjects were asked what they’d do if they were given $2,000, and then offered a choice between a sure loss of $500 or a 50/50 chance of losing either $1,000 or nothing.  Most chose the gamble.

Both sets of choices are the same: a sure $1,500 versus a 50/50 chance of ending up with either $1,000 or $2,000. Only the framing differs, and the framing determines the response.
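
A quick computation confirms the equivalence. The little helper below is just illustrative bookkeeping, not anything from the book:

```python
# Check that the two framings describe identical final positions.

def outcomes(endowment, sure_change, gamble_changes):
    """Final wealth under the sure option and under each gamble branch."""
    sure = endowment + sure_change
    gamble = [endowment + change for change in gamble_changes]
    return sure, gamble

# Frame 1: given $1,000, sure +$500 vs. 50/50 of +$1,000 or nothing.
print(outcomes(1000, +500, [+1000, 0]))   # (1500, [2000, 1000])

# Frame 2: given $2,000, sure -$500 vs. 50/50 of -$1,000 or nothing.
print(outcomes(2000, -500, [-1000, 0]))   # (1500, [1000, 2000])
```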

Credit card companies understand this.  In some states, when stores charge different prices depending on whether you buy with cash or credit cards, they’re required by law to say it is a “cash discount,” not a “credit card surcharge.”   People are more willing to pass up a cash discount than to pay a credit card surcharge, even though the only difference is the words.

The subconscious, intuitive mind can be swayed by anchoring on irrelevant facts and impressions. We’re at risk of being intentionally primed to think in certain ways without knowing it. This is so important that I intend to devote a separate post to it.

Daniel Kahneman, although a psychologist, won the Nobel Memorial Prize for Economics in 2002. That shouldn’t be surprising, because economics is a sub-set of psychology. Economics is the study of decision-making, built on the hypothesis that human action is determined by incentives, especially material incentives. A section of Thinking, Fast and Slow is devoted to showing why that isn’t always so.

He and his friend and collaborator, the late Amos Tversky, were true scientists. As noted above, every assertion in the book is backed by a study or experiment, many conducted by Kahneman and Tversky themselves, and the appendix includes two of their most important scientific papers.

The business journalist Michael Lewis has written a book about their collaboration, The Undoing Project, which I haven’t read but am pretty sure is good.

Among psychologists, Kahneman and Tversky are at one end of a spectrum and Jordan Peterson, whom I posted about last week, is at the other. They were scientists. Peterson is more like an artist or guru: he draws on literature, philosophy, mythology and personal experience for his insights. His work is neither better nor worse than Thinking, Fast and Slow. They operate on different levels.


Daniel Kahneman: “We’re Beautiful Devices” by Oliver Burkeman for The Guardian.

The Friendship That Created Behavioral Economics, an interview with Michael Lewis for The Atlantic about the collaboration between Daniel Kahneman and Amos Tversky.

How to Dispel Your Illusions, a review of Thinking, Fast and Slow by Freeman Dyson for The New York Review of Books.
