The Bright and the Blind at Davos 2013

Why perhaps we should be worried about the decisions being made at Davos.

Many of the great and good who gathered in the conference halls at Davos for the 2013 World Economic Forum last week would no doubt be pleased to learn that the behavioural sciences have shown that the smarter we are and the higher our IQs, the less likely we are to be affected by the heuristics and cognitive biases which can cause us to make irrational decisions.

However, even the brightest minds aren't entirely immune. Daniel Kahneman - Nobel prizewinner, psychologist and one of the fathers of the behavioural sciences - is humble about his own (ir)rationality. On the opening morning at Davos, he talked about the science behind human decision-making and cognitive biases, outlining some of the powerful ideas, theories and concepts developed with his colleague Amos Tversky.[1] After more than 40 years of studying the subject, however, he says that even he is not immune to bias: "My intuitive thinking is just as prone to overconfidence, extreme predictions and the planning fallacy as it was before I made a study of these issues." He has also remarked that many of his academic peers (particularly the economists among them) regularly fall for the very same biases.

How come even the brightest don't have immunity to biases?

So why is it that bright people like Kahneman and Tversky - and perhaps many of the Davos delegates - are less biased and yet not totally immune? A simple three-question test has revealed some answers. Shane Frederick, a Professor of Marketing at Yale, has developed one of the most informative and accurate predictors of susceptibility to heuristics and cognitive biases: the Cognitive Reflection Test (CRT), which measures the extent to which we apply logical reasoning to our intuitions and gut feelings. He found wide variation in our degree of susceptibility, and overall our performance is pretty poor. Of 3,500 people tested, a mix of US students and other citizens, only 17% answered all three questions correctly and 33% scored a big fat zero. (For all three questions, see Frederick, 2005.)[2]
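To give a flavour of the test, here is its best-known item, the famous 'bat and ball' problem: a bat and a ball cost $1.10 in total; the bat costs $1.00 more than the ball; how much does the ball cost? The answer that springs to mind - 10 cents - doesn't survive a moment's checking, as this little sketch (our own illustration, in Python) shows:

    # The 'bat and ball' CRT item: bat + ball = $1.10 and bat = ball + $1.00.
    total = 1.10
    difference = 1.00

    intuitive_ball = 0.10                    # the answer most people blurt out
    correct_ball = (total - difference) / 2  # from: ball + (ball + 1.00) = 1.10

    print(f"Intuitive: ball = ${intuitive_ball:.2f}, "
          f"sum = ${intuitive_ball + (intuitive_ball + difference):.2f}")  # $1.20 - wrong
    print(f"Checked:   ball = ${correct_ball:.2f}, "
          f"sum = ${correct_ball + (correct_ball + difference):.2f}")      # $1.10

Reaching the correct answer of 5 cents takes only a line of algebra; the CRT measures whether we bother to do that check at all.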

But CRT performance is to some degree correlated with intelligence.[3] Students from MIT, one of the top universities in the US, performed much better: 48% got all three questions right. That still leaves 52% of super-intelligent MIT students who got some or all of them wrong (7% got none right, and 45% got only some correct). Although it certainly helps, just being ‘smart’ in the traditional sense does not make you completely immune to irrationality. It would be interesting to see how Davos attendees would fare with the CRT.

Frederick proposed that one of the reasons for such wide variation was that low scorers on the CRT were lazier or more impatient - more prone to be ‘cognitive misers’ and give the first answer that came into their heads, relying on their intuitions. Studies have found that bright people are often (but not always) more patient and have more self-control, and it is this which gives them greater immunity to bias. Smart people are more likely to have developed logical rules to apply to problems, and to know when to apply which rule. They are also more likely to monitor their thinking and gut feelings and double-check their answers logically. However...

Bright people may have fewer biases but they are more blind to them

It's findings like these that can give us a false sense of security about our own intelligence - an overconfidence that can actually make us blind to those times when we might be mistaken or in the process of making a poor decision. And here comes the bias: we tend to think that it is only other people who rely too heavily on intuition or who make fast, ill-thought-out decisions. Consequently, when we learn about the different cognitive biases, our common reaction is to recognise them easily in others but rarely see them in ourselves – what researchers call the bias blind spot.

And it's this - the fact that bright people can be the most blind to their own biases - that might give the Davos delegates cause to worry. What is more, there is now evidence that the smarter you are, the more blind you are to your own biases. A fascinating study published last summer by Richard West, Russell Meserve and Keith Stanovich found that smart people are actually the most prone to the bias blind spot, seeing biases occurring in others but not in themselves. They tested for the bias blind spot on heuristics and biases such as outcome bias, base-rate neglect, framing, the conjunction fallacy, anchoring and confirmation bias in two different samples: firstly students at James Madison University, and secondly a heterogeneous group of Amazon Mechanical Turk workers[4], the majority not students, but all residents of the US. The authors state: "We found that none of these bias blind spots were attenuated by measures of cognitive sophistication such as cognitive ability or thinking dispositions related to bias. If anything, a larger bias blind spot was associated with higher cognitive ability."[5]

The researchers discuss two possible reasons for this blindness - one looking outward, the other looking inward.

  • The ‘looking outward’ explanation is what they call 'naïve realism' – our belief that we perceive the world and its intricacies correctly; that we are right and in control because we think we are thinking logically all the time...because often, we are. We don’t expect to be wrong, but we can still suffer from poor judgement or behave irrationally. The WEF note that "Perception is actually an active process of understanding, through which people construct their own version of reality".[6] And as Kathryn Schulz said in her much-watched TED talk 'On Being Wrong': "The miracle of your mind isn’t that you can see the world as it is. It’s that you can see the world as it isn’t."[7]
  • The 'looking inward' explanation put forward by West and his colleagues is the 'introspection illusion' - a failure to observe our own behaviour correctly: "The bias blind spot arises, […], because we rely on behavioural information for evaluations of others, but on introspection for evaluations of ourselves. The biases of others are easily detected in their overt behaviours, but when we introspect we will largely fail to detect the unconscious processes that are the sources of our own biases." It seems our in-built mechanism for tracking what we do and how we behave is faulty, since we are not aware of the many unconscious cues which can affect our behaviour. Introspection tells us that we do something for one reason whereas in reality it may be due to an entirely different cause that we have completely failed to take into account.

A global survey of executives by McKinsey in 2009 found evidence which could suggest the presence of bias blind spots in business leaders. While 80% of C-suite executives thought that "Management admitted mistakes and killed unsuccessful initiatives in a timely manner", only 49% of employees below the C-suite concurred - suggesting that although management confidently believed they made good decisions, those observing them thought otherwise...[8]

Combating bias blindness

So how can the brightest and best help themselves to avoid bias and make better decisions if they can’t always see for looking? Simply being aware that they are subject to a wide range of cognitive biases and heuristics, many of which operate beneath conscious awareness, can help them become less blind to bias. Emily Pronin - Professor of Psychology at Princeton, who has conducted considerable research into the bias blind spot together with Matthew Kugler - found that simply priming participants with an article, allegedly published in Science magazine, which discussed subconscious influences on attitudes and behaviour could actually reduce their bias blind spot. The article began:

" 'I'll know it when I see it', runs the popular refrain. It's been used to explain how we can recognise everything from obscenity to true love. But how much can we trust what we see, or rather, what we think we see? For decades, cognitive psychologists have been discovering that there is more going on in our brains than we could ever be consciously aware of, even for a moment." The article made sure to highlight that:

"…we know less about our motivations and about the sources of our actions, judgements, and decisions than we thought."[9]

One way of diminishing the bias blind spot is simply to make people more aware of it, so they are primed to be on the lookout for biases impacting their decision-making rather than relying on their introspections. However, at Davos, Kahneman was pessimistic about whether we can change our thought processes simply by being more self-aware.[10]

A better approach might be the one taken by business schools, which for some time have been looking at how to minimise certain cognitive biases (such as the planning fallacy, groupthink, overconfidence and confirmation bias). Harvard Business School, McKinsey and other institutions and experts have outlined some clever strategies to help 'de-bias' our decisions, focusing on concrete mechanisms and automatic procedures to safeguard against bias blind spots. These include better methods for analysing information and more effective procedures for running meetings. For example, Kahneman suggested to Davos attendees that they should start meetings with an anonymous vote on what the decision ought to be – to avoid groupthink. Edward de Bono devised an elegant and effective way to de-bias meetings in his 'Six Thinking Hats' system, which runs through six different perspectives and viewpoints systematically. This can help us break out of habitual ways of thinking - emotively and negatively, for example - and force us to think creatively and optimistically too.

Further, behavioural finance expert James Montier points us to investors who have put in place simple yet effective mechanisms to signal evidence of bias. For example, George Soros keeps a real-time investment diary, carefully recording each investment along with the thoughts and emotions that went into the decision at the time, so that he can later go back and see whether he was successful due to skill and careful assessment, or just dumb luck... Only with this sort of behavioural record can we accurately observe our successes and our mistakes.[11]
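As a minimal sketch of what such a record might look like - the file name, fields and example entry below are our own assumptions, not Soros's actual practice - a decision diary can be as simple as appending a time-stamped row at the moment of decision:

    import csv
    from datetime import datetime, timezone

    DIARY = "decision_diary.csv"  # hypothetical filename

    def log_decision(position, rationale, emotions, confidence):
        """Append one entry, captured before the outcome is known."""
        with open(DIARY, "a", newline="") as f:
            csv.writer(f).writerow([
                datetime.now(timezone.utc).isoformat(),
                position, rationale, emotions, confidence,
            ])

    # An invented example entry: the reasoning and the feelings, on the record.
    log_decision("Long EUR/USD", "ECB likely to hold rates", "anxious, excited", 0.7)

Because each entry is written before the outcome is known, the later review is checked against what was actually thought and felt at the time, not against a flattering reconstruction.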

Lee Howell, Managing Director and Head of the Risk Response Network at the WEF, says: "The best way to overcome cognitive biases is to become aware of them, by engaging with others who have different perspectives."[12] Similarly - although this is often hard on our egos - asking others to be honest and tell us how they perceive our actions, and then being open-minded and non-judgemental about those different perceptions of our own behaviour, could ultimately help to increase our self-awareness and change our behaviour for the future.

Self-ethnography could also contribute a solution. We are subject to many different biases, most of which are subconscious. So to understand our behaviour, whether we happen to be a Davos delegate or not, we need to spend a little more time thinking about how to capture most accurately how we think and behave. In doing this we might well uncover a picture of ourselves we had no idea existed. For example, we can try to disrupt our own habitual actions and thoughts, to help us become more aware of automatic behaviour. Or we can develop ways to observe, detect, track and record our own behaviour, so that we can see what others see in us and build a truer picture of ourselves. Picking a few measurable aspects of our lives and recording and analysing them could reveal new things about our behaviour, offer new perspectives, and give us a new freedom to take a different path sometimes.
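By way of illustration - the numbers here are invented - even a handful of reviewed diary entries can expose a gap between how confident we felt and how often we were actually right:

    from statistics import mean

    # Hypothetical reviewed diary entries: (stated confidence, outcome: 1 = success).
    entries = [(0.9, 1), (0.8, 0), (0.85, 0), (0.7, 1), (0.9, 0)]

    avg_confidence = mean(c for c, _ in entries)
    success_rate = mean(o for _, o in entries)

    print(f"Average stated confidence: {avg_confidence:.0%}")  # 83%
    print(f"Actual success rate:       {success_rate:.0%}")    # 40%

A gap like that is invisible to introspection; it only appears once behaviour has been recorded and counted.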

Less biased and blind through behavioural understanding

Being smart, whilst it may give you a better chance of evading the many heuristics and cognitive biases that affect your decision-making, may also mean you are more prone than others to failing to recognise when your own thinking is biased. We often have such self-belief and confidence in our intelligence that it may not occur to us that we are capable of bias and error more often than we might expect. Our very success in conventional intelligence tests primes us to believe that we are less likely than others to get things wrong.

But perhaps the CEOs, heads of state and public figures at Davos will take heed from Daniel Kahneman and realise, with some new humility, that they will always suffer from bias - and accept that they will rarely be able to detect it themselves. Perhaps they are even now setting up automatic safeguards, procedures and defaults to make them pause, stop and think, or see what they have not seen before, and building mechanisms with which to accurately record and observe their behaviour so that they can reflect on it. The saying 'the camera never lies' seems particularly apposite here. There's no harm in any of us adopting a more self-reflective, double-checking style of decision-making - and stepping back to observe and better understand our behaviour may well help us all to see the biases we were once blind to.

Read more from Crawford Hollingworth.


[1] For a summary of his talk ‘Thinking Fast and Slow’ see this link: http://www.weforum.org/sessions/summary/thinking-fast-and-slow

[2] For the three questions, see p27 of Frederick, S. “Cognitive Reflection and Decision Making” Journal of Economic Perspectives, Volume 19, Number 4, Fall 2005, p25–42. http://psych.fullerton.edu/MBIRNbAUM/PSYCH466/articles/Frederick_CRT_2005.pdf

[3] Stanovich, K.E. 1999. Who is Rational? Studies of Individual Differences in Reasoning. Mahwah, New Jersey: Lawrence Erlbaum; Stanovich, K. E. and West, R. 2002. Individual differences in reasoning: Implications for the rationality debate? pp. 421-440 in T. Gilovich, D. Griffin, and D. Kahneman [eds]. Heuristics and Biases: The Psychology of Intuitive Judgment. New York: Cambridge University Press; Kahneman, D. and Frederick, S. 2002. ‘Representativeness revisited: Attribute substitution in intuitive judgment’, in the same volume.

[4] Check out this article in the NYT, which explains how Mechanical Turk workers operate: http://www.nytimes.com/2007/03/25/business/yourmoney/25Stream.html?ex=1332475200&en=cd1ce5cc4ee647d5&ei=5090&partner=rssuserland&emc=rss

[5] West, R.F., Meserve, R.J., Stanovich, K.E. “Cognitive Sophistication Does Not Attenuate the Bias Blind Spot” Journal of Personality and Social Psychology, 2012, Vol. 103, No. 3, 506-519

[6] World Economic Forum ‘Global Risks Report 2013’ p20: http://www3.weforum.org/docs/WEF_GlobalRisks_Report_2013.pdf

[7] Kathryn Schulz at TED, ‘On Being Wrong’, March 2011. Schulz’s talk now has well over 1 million views. http://www.ted.com/talks/kathryn_schulz_on_being_wrong.html

[8] McKinsey “When to trust your gut” 2009

[9] Pronin, E., & Kugler, M. B. (2007). Valuing thoughts, ignoring behavior: The introspection illusion as a source of the bias blind spot. Journal of Experimental Social Psychology, 43, 565-578.

[10] John Gapper, FT: ‘Daniel Kahneman on double optimism of CEOs’, 23 Jan 2013: http://blogs.ft.com/businessblog/2013/01/daniel-kahneman-on-double-optimism-of-ceos/

[11] Montier, J. “The Little Book of Behavioural Investing” Wiley, 2010, p149-152

[12] Lee Howell, Managing Director, and Head of the Risk Response Network at the World Economic Forum. http://forumblog.org/2012/09/global-innovation-through-risk-resilience/

 
