BREAKING DOWN THE PSYCHOLOGY OF DECISION-MAKING

How mindfulness can help you avoid the common pitfalls of intuitive decision-making

An excellent framework for understanding how the human mind works is Professor Daniel Kahneman’s breakdown of the mind into Systems 1 and 2 (see his book, “Thinking, Fast and Slow”).

System 1 is the automated, intuitive, subconscious part of the mind. System 2 is the rational, conscious mind.

Both are vital to our ability to function. Both have flaws. Your internal computer (System 1) makes mistakes. Your conscious mind (System 2) is lazy and lets System 1 run the controls without bothering to check the results. The more tired you are, the more likely you are to leave System 1 in charge.

Allowing System 1 to inform decisions, without the scrutiny of System 2, is a sure-fire way to make poor decisions.

A solution is the use of mindfulness, or present-moment awareness, to engage System 2 (your rational, conscious mind) during decision-making. By applying reason and thinking through the evidence, you can prevent the flawed intuitive assessments of System 1 (your internal computer) from informing your decisions.

Here are some common mistakes, according to Professor Kahneman, that you can avoid by applying this approach:

  1. Believing System 1’s false confidence

Your internal computer is hard-wired to generate what it perceives to be a coherent picture of the situation in front of it, based on whatever information it has received.

Whether you base it on one input or one thousand inputs, your internal computer will develop a story of what is going on. And it will inform you that you now understand the subject. 

The important thing to note is that the confidence with which you hold an opinion is no reflection of the probability that it is correct. This is because your feeling of understanding the subject takes no account of the quantity or quality of the evidence on which System 1 based its assessment. Your confidence may merely reflect the fact that System 1 has done its job in generating a coherent understanding of what it thinks is happening.

Your internal computer has no gauge of poor or insufficient data. It simply seeks coherence, regardless. This leaves you prone to developing strongly held opinions based on flimsy evidence.

Solution: Before you finalise a decision, consciously question the quantity and quality of the information on which your opinion is based. This will help you screen out any flimsy intuitive assessments you may have made without realising it. Remember: confidence in your opinion is not necessarily a gauge that you understand the subject.

  2. Falling for phony expertise

There is only one thing worse than falling for the false confidence of your own internal computer. And that is falling for the false confidence of someone else’s internal computer.

If you are using someone else’s opinion to inform your decision, then make sure they are not making the mistake outlined above.

Daniel Kahneman argues that the opinion of an expert can be trusted only when the following criteria are fulfilled:

  1. The event they are offering an opinion on occurs sufficiently regularly that it becomes predictable
  2. The person issuing the prediction has had the opportunity to learn through regular practice in predicting the same event
  3. That person has been able to learn through immediate feedback on their performance

Solution: Ignore the confidence with which experts express their opinions and instead question them to see whether the event they are predicting and their experience match the criteria above.

Even better, if your decision is based on a prediction of a future event, validate the expert’s opinion by looking for trends in past events that may predict the outcome of future ones. Accurate data on actual events may be a better predictor of future events than expert opinions, however confidently they are expressed.

And for anyone managing a team: consider giving your team adequate, timely feedback so that they can learn from their own past decisions.

  3. Mistaking plausibility for probability

We are prone to believing that something that is merely plausible is also likely, purely because it is plausible.

An example is trusting that the outcome of one event will be positive solely because outcomes of the previous two events were. It’s plausible but not necessarily true. Your internal computer is very good at seeing correlation and believing it reflects causation.

This is because it is constantly scanning for causal arguments, which help generate the picture of coherence that System 1 updates on a rolling basis. You are hard-wired, intuitively, to believe causal arguments.

This is an important one to remember, particularly when making investment decisions. System 1 is prone to over-confidence, so intuitive predictions on the likely outcomes of bets, share prices, company or investment performance should not be trusted by themselves.

Solution: Remember that you are prone to believing causal arguments solely on the basis that they are plausible. Consider that this may not be an effective predictor of whether the scenario is likely.

First, make a note of your instinctive, intuitive response. Remember that this comes from your internal computer, which tends to be over-confident. As a result, this intuitive prediction is probably unrealistic.

By becoming mindful of your thought processes as you make decisions, you can catch yourself before you base a decision on a wildly over-optimistic intuitive assessment.

Use data on what has actually happened in the past to generate a more accurate prediction of future events.

 

For more information on these ideas, watch Professor Kahneman himself discuss the interplay between System 1 (the internal computer) and System 2 (the rational, conscious mind): https://www.youtube.com/watch?v=CjVQJdIrDJ0

Source for all the ideas in this blog: "Thinking, Fast and Slow" by Professor Daniel Kahneman

 

 Image © M.Gove Fotolia
