Eyes Wide Open: How to Make Smart Decisions in a Confusing World

Author: Noreena Hertz
Year written: 2019

A key underlying factor that led to the disaster was the way in which NASA’s engineers shared information.

In particular, the Columbia Accident Investigation Board singled out the ‘endemic’ use of a computer program. A program we more usually associate with corporate seminars or high-school classrooms: Microsoft PowerPoint.

The investigators believed that by presenting the risks associated with the suspected wing damage in PowerPoint, the engineers had significantly understated the potential for disaster.

As the Columbia circled the earth, Boeing Corporation engineers scrambled to work out the likely consequences of the foam striking the thermal tiles on the Shuttle’s left wing. Tragically, when they presented their findings, their methods of presentation proved to be deeply flawed. Information was ‘lost’, priorities ‘misrepresented’, key explanations and supporting information ‘filtered out’. The ‘choice of headings, arrangement of information and size of bullets … served to highlight what management already believed’, while ‘uncertainties and assumptions that signalled danger dropped out of the information chain’.17

In other words, the very design tools that underpin PowerPoint’s clarity had served to eclipse the real story. They had distracted its readers, and told a partial, and highly dangerous, tale.

Tufte, applying his expertise in information design, investigated these claims further, and found even more to be concerned about.

He analysed all twenty-eight PowerPoint slides that had been used by the engineers to brief NASA officials on the wing damage and its implications during Columbia’s two-week orbit of the earth, and discovered that some were highly misleading.

The title of a slide supposedly assessing the destructive potential of loose debris, ‘Review of Test Data Indicates Conservatism for Tile Penetration’, was, in Tufte’s words, ‘an exercise in misdirection’. What the title did not make clear was that the pre-flight simulation tests had used a piece of foam 640 times smaller than that which slammed against the Shuttle. This crucial information was buried towards the bottom of the PowerPoint slide. Nobody seemed to take any notice of it – they were too focused on the headline at the top, and did not take in the full picture.18

Tufte found that the limited space for text on PowerPoint slides led to the use of compressed phrases, with crucial caveats squeezed into ever smaller font sizes. This created a reliance on ‘executive summaries’ or slide titles that lost the nuance of uncertainties and qualifications.

In cases such as this, in other words, oversimplification leads to the loss of vital detail.

Complexity, a reality of executive decision-making, is something the medium of PowerPoint dangerously disregards.

It’s not just NASA or professors who are concerned about the potential of PowerPoint to blinker our vision.

General James N. Mattis, who served as Commander of the United States Central Command after taking over from the subsequently disgraced General David Petraeus in 2010, has always had a way with words. He once advised his marines in Iraq to ‘Be polite. Be professional. But have a plan to kill everybody you meet.’19 Mattis’s assessment of PowerPoint was equally merciless: ‘PowerPoint makes us stupid,’ he said in 2010, at a military conference in North Carolina.

It is a sentiment echoed by his military colleagues. Brigadier General H.R. McMaster, who banned PowerPoint when leading the successful assault on the Iraqi city of Tal Afar in 2005, said, ‘It’s dangerous, because it can create the illusion of understanding and the illusion of control.’ Indeed, so aware is the US Army of the medium’s powers of evasion that it intentionally deploys PowerPoint when briefing the media, a tactic known as ‘hypnotising chickens’.20

Of course, this is not to say that we never need information to be summarised for us. There are times when we clearly do. But if we’re making decisions on the basis of summaries, we need to remind ourselves of the significant risk that in the process key details and subtleties may well be overlooked, or written in a font so small that we don’t pay attention to them.

We must also take a moment to remind ourselves that what someone else has deemed important may not be what matters most to us, or what will really make a difference to the decision we make.

So next time you’re looking at a PowerPoint presentation, or indeed any type of executive summary, look beyond the large font and the bold headline points. The information you actually need may be somewhat more buried. Or it may not even be on the slides at all. Do also put your presenter on the spot. If it matters, ask them to clarify and expand, and provide more information.

If you don’t, you risk being shown only tigers, and never snakes.

The Cult of the Measurable

It’s not only particular forms of presentation that can blinker our vision.

One type of information often dominates our attention – numbers. This in itself can be problematic.

Numbers can give us critical information – if we want to know whether to put a coat on, we’ll look at the temperature outside; if we want to know how well our business is doing, we’ll need to keep an eye on revenues and expenditures; and if we want to be able to compare the past with the future, we’ll need standard measures to do so.

But the problem is that the Cult of the Measurable means that things that can’t really be measured are sometimes given numerical values.21

Can a wine really be 86 per cent ‘good’? That doesn’t sound right to me, but a multi-billion-dollar industry disagrees. Robert Parker, probably the most famous wine critic in the world, ranks wines on a scale from fifty to one hundred. And winemakers prostrate themselves before him, praying for a rating of ninety-two or above, because such a number practically guarantees commercial success, given how influential Parker’s ratings have become with wine drinkers.22

Yet what really is the difference between a rating of ninety-two and an eighty-six? How can we tell what that six-point difference means?

And are someone else’s tastes necessarily going to correlate with yours? I’ve already raised the question of whether Joe from Idaho on TripAdvisor is really the best person to steer your choice as to where to go on holiday. We also need to ask ourselves whether what I mean by four stars is what you mean by four stars. Whether my Zagat score of twenty means the same as yours. As the Financial Times’s wine critic Jancis Robinson points out on the subject of Parker’s influence, ‘Make no mistake about it, wine judging is every bit as subjective as the judging of any art form.’23

So, before you make a decision based on a number, think about what the number is capturing, and also what it is not telling you.

Risk is another area where our attempts to assign clear measures often fail. While it’s true that there are some areas where we can meaningfully quantify risk – such as engineering, where we can come up with a number for how likely a building is to withstand an earthquake, say; or medicine, where we can estimate a patient’s chance of responding to a particular drug – this approach doesn’t work effectively in all spheres. Indeed, in our turbocharged world, in which changes happen more quickly and less predictably than ever before, many of our attempts to assign probabilities to future events are likely to be pretty meaningless. As President Obama said, reflecting on the huge range of probabilities that various senior intelligence officials had proffered back in March 2011 as to whether Osama bin Laden was the ‘high value target’ spotted within a high-walled compound in Abbottabad in Pakistan – with confidence levels that ranged between 30 and 95 per cent – ‘What you started getting was probabilities that disguised uncertainty as opposed to actually providing you with more useful information.’24

In our desire to reduce everything to some sort of standardised measure, to create universal meanings for things that will always be subjective, and to create the illusion of certainty when uncertainty in fact prevails, do we not risk making decisions on the basis of what may seem to be intelligence-rich information, but is in truth pretty meaningless?

Not everything can be measured, and not everything can be compared, especially in a world as complex as ours. Indeed, Obama, realising this, responded to the various probabilities he’d been presented with: ‘Look guys, this is a flip of the coin. I can’t base this decision on the notion that we have any greater certainty than that.’25 A flip of the coin which, we now know, President Obama won and Osama bin Laden lost.

All That Counts

Another danger of putting the measurable on a pedestal is that what cannot be measured often ends up being discarded or dismissed. But just because something can’t be quantified doesn’t mean we should ignore what it is telling us.

A conversation with a senior director of a multi-billion-dollar international children’s charity brought this point home to me in a stark way. Huddled in a Cambridge University antechamber, encircled by six of her colleagues, Ms Broun explained that the key problem children now faced in many middle-income countries was domestic violence.26 She and her colleagues knew this because they had personally witnessed countless cases of children bearing marks of physical abuse alongside symptoms of mental strain. They’d heard the children’s stories, seen their scars, and noted the corroborating accounts of their teachers and community leaders.

Domestic violence, however, is hard to measure. What do you record? The number of bruises? How can you capture in numbers how terrified a child feels? As a result, Ms Broun had been unable to convince the organisation’s head office that domestic violence was something vital for them to target. Instead she was told to focus her efforts on problems for which they could collect numbers, and thereby easily gauge progress – problems such as under-attendance at school, or children without sufficient vaccinations.

These instructions were given to Ms Broun despite the fact that in middle-income nations, such measurable and easily trackable problems had to a great extent been resolved, whereas domestic violence was active and growing.

In many ways it’s hard to accept this story. It speaks of bureaucratic intransigence and rigidity, and it’s also symptomatic of the way in which the Cult of the Measurable can overshadow what really needs to be seen, and what really needs doing. In this case it is incredibly painful, because the happiness and welfare of children is at stake. If an over-reliance on numbers can take hold even in a situation like this, where in your business or personal life could the Cult of Comfortable Measurement be adversely affecting your decisions?27

By devaluing that which cannot be measured, we risk not only making poorer decisions, but also distorting our priorities and goals. As was once said, ‘Not everything that counts can be counted, and not everything that can be counted counts.’28

Glass Half Full

So we need to be careful about focusing excessively on numbers, or on things in bold typeface, or on the information that others deem most relevant for us, or that glitters most brightly. We also need to start paying more attention to people’s stories and testimonies, even if they cannot be quantified.

Beyond this, there’s another type of information we should watch out for that is prone to draw us in – information we want to hear, over information we don’t.

Neuroscientist Tali Sharot explored this theme by putting volunteers in a brain scanner and asking them what they believed the chances were of various unpleasant events occurring to them in their lifetime. She asked questions like: How likely are you to get burgled? How likely are you to contract genital warts? How likely are you to develop Parkinson’s disease? That sort of thing.29

After each answer, she immediately told her volunteers what the real chances were of such an event happening. So, if someone thought they had a 10 per cent chance of developing cancer, she would reveal that the real probability was 30 per cent – quite a lot worse than they had believed.

What Sharot discovered was that when her subjects were given bad news, news that should have led them to be more concerned than they were previously, the part of their brains that should have registered the mismatch between their prediction and the true chance of disaster showed only low-level activation.

However, when a subject was given information that was better than expected – for example, if someone thought they had a 50 per cent chance of being burgled, but was then told that their real chance was only 30 per cent – the part of their brain that processed the information went wild. What’s more, when the volunteers were asked to go through the list of unpleasant events again, they actively changed their beliefs when the information they had been given improved their prospects. For example, if they found out that they were actually less likely to suffer from infertility than they had thought, they adjusted to the new risk percentages presented. But if the new information pointed to an even higher chance of something bad happening to them, they simply ignored it.

When it comes to things that affect us directly, it seems that many of us dismiss information suggesting that bad things will happen to us, and only pay attention to the good stuff. It doesn’t matter what information we receive; unconscious processes in our brains are determined to show us a rosy glow.

There are obvious dangers to this when it comes to making decisions. If your unconscious belief is that you won’t get lung cancer from smoking, then you’re unlikely to choose to quit. For every warning from an anti-smoking campaigner, your brain will be giving a lot more weight to that story of the ninety-nine-year-old lady who smokes fifty cigarettes a day but is still going strong. You’re not doing this consciously, but it is happening.

Similarly, if you’re a trader buying and selling stocks and shares, or an investor looking to buy another property, you’ll be paying more attention to evidence of sustained growth and stories of rising prices than to the nay-sayers who predict a crash – a partial explanation for financial and housing bubbles.30

The inability to properly process news that suggests something bad may happen to us is clearly a dangerous trait for nearly all decision-makers – not just traders or smokers or property speculators. Dr Sharot’s research reveals that 80 per cent of us are very vulnerable to this mental lapse.31 Interestingly, however, there is one group of people who, it turns out, update their beliefs in a more balanced way: people with mild depression appear to weigh the good and the bad more evenly when they receive new information.

If you’re not depressed yourself, however, don’t despair. Being aware that you’re prone to this thinking error is a start – it means you can challenge your immediate reactions and reflect upon how your decisions would be affected if your optimism were overstated. You might also, now that you are cognisant of your ability to trip yourself up, want to take out insurance against the worst happening. As Sharot advises, do carry an umbrella even if you’re sure the sun will shine, and do buy health insurance even if you’re certain nothing bad could ever happen to you.32

How Not to Spot Aspirin Poisoning