
Black Box Thinking:

Why Most People Never Learn from their Mistakes—But Some Do


by Matthew Syed

After finishing this book in April of 2024, I wrote,

 

"This book is all about learning how to create an attitude and methodology of openly looking for, acknowledging, and learning from our mistakes."


My clippings below collapse this 325-page book into 8 pages, as measured in 12-point type in Microsoft Word.


See all my book recommendations.  


Here are the selections I made:

Part I | THE LOGIC OF FAILURE


In a separate investigation, Lucian Leape, a Harvard University professor, put the overall numbers higher. In a comprehensive study, he estimated that a million patients are injured by errors during hospital treatment and that 120,000 die each year in America alone.

 

 

Doctors were effectively killing patients for the better part of 1,700 years not because they lacked intelligence or compassion, but because they did not recognize the flaws in their own procedures. If they had conducted a clinical trial (an idea we will return to), they would have spotted the defects in bloodletting: and this would have set the stage for progress.

 

 

So, just to reemphasize, for our purposes a closed loop is where failure doesn’t lead to progress because information on errors and weaknesses is misinterpreted or ignored; an open loop does lead to progress because the feedback is rationally acted upon.

 

 

Historically, health-care institutions have not routinely collected data on how accidents happen, and so cannot detect meaningful patterns, let alone learn from them.

 

 

As Eleanor Roosevelt put it: “Learn from the mistakes of others. You can’t live long enough to make them all yourself.”

 

 

Attention, it turns out, is a scarce resource: if you focus on one thing, you will lose awareness of other things.

 

 

Social hierarchies inhibit assertiveness. We talk to those in authority in what is called “mitigated language.”

 

 

That is one of the ways that closed loops perpetuate: when people don’t interrogate errors, they sometimes don’t even know they have made one (even if they suspect they may have).

 

 

It is about the willingness and tenacity to investigate the lessons that often exist when we fail, but which we rarely exploit. It is about creating systems and cultures that enable organizations to learn from errors, rather than being threatened by them.

 

 

Failure is rich in learning opportunities for a simple reason: in many of its guises, it represents a violation of expectation. It is showing us that the world is in some sense different from the way we imagined it to be.

 

 

Failure is thus a signpost. It reveals a feature of our world we hadn’t grasped fully and offers vital clues about how to update our models, strategies, and behaviors.

 

 

Psychologists often make a distinction between mistakes where we already know the right answer and mistakes where we don’t.

 

 

“The history of science, like the history of all human ideas, is a history of . . . error,” Popper wrote. “But science is one of the very few human activities — perhaps the only one — in which errors are systematically criticized and fairly often, in time, corrected. This is why we can say that, in science, we learn from our mistakes and why we can speak clearly and sensibly about making progress.”

 

 

Most closed loops exist because people deny failure or try to spin it. With pseudosciences the problem is more structural. They have been designed, wittingly or otherwise, to make failure impossible. That is why, to their adherents, they are so mesmerizing. They are compatible with everything that happens. But that also means they cannot learn from anything.

 

 

Science has often been regarded as a quest for confirmation. Scientists observe nature, create theories, and then seek to prove them by amassing as much supporting evidence as possible. But we can now see that this is only a part of the truth. Science is not just about confirmation, it is also about falsification.

 

 

Knowledge does not progress merely by gathering confirmatory data, but by looking for contradictory data.

 

 

Aviation is different from science but it is underpinned by a similar spirit. After all, an airplane journey represents a kind of hypothesis: namely, that this aircraft, with this design, these pilots, and this system of air traffic control, will reach its destination safely. Each flight represents a kind of test. A crash, in a certain sense, represents a falsification of the hypothesis. That is why accidents have a particular significance in improving system safety, rather as falsification drives science.

 

 

In many cases, the only way to drive improvement is to find a way of “turning the lights on.” Without access to the “error signal,” one could spend years in training or in a profession without improving at all.

 

 

The difference between aviation and health care is sometimes couched in the language of incentives. When pilots make mistakes, it results in their own deaths. When a doctor makes a mistake, it results in the death of someone else. That is why pilots are better motivated than doctors to reduce mistakes.

 

 

But this analysis misses the crucial point. Remember that pilots died in large numbers in the early days of aviation. This was not because they lacked the incentive to live, but because the system had so many flaws. Failure is inevitable in a complex world. This is precisely why learning from mistakes is so imperative.

 

 

In 1601, Captain James Lancaster, an English sailor, performed an experiment on the prevention of scurvy, one of the biggest killers at sea. On one of four ships bound for India, he prescribed three teaspoons of lemon juice a day for the crew. By the halfway point, 110 men out of 278 had died on the other three ships. On the lemon-supplied ship, however, everyone survived.

 

 

But it took another 194 years for the British Royal Navy to enact new dietary guidelines. And it wasn’t until 1865 that the British Board of Trade created similar guidelines for the merchant fleet. That is a glacial adoption rate. “The total time from Lancaster’s definitive demonstration of how to prevent scurvy to adoption across the British Empire was 264 years,” Gillam says.


Part II | COGNITIVE DISSONANCE

 

 

A study by Terrance Odean, professor of finance at UC Berkeley, found that the winning stocks investors sold outperformed the losing stocks they didn’t sell by 3.4 percent. In other words, people were holding on to losing stocks too long because they couldn’t bring themselves to admit they had made a mistake. Even professional stock pickers — supposedly ultra-rational people who operate according to cold, hard logic — are susceptible: they tend to hold losing stocks around 25 percent longer than winning stocks.

 

 

As the philosopher Karl Popper wrote: “For if we are uncritical we shall always find what we want: we shall look for, and find, confirmations, and we shall look away from, and not see, whatever might be dangerous to our pet theories. In this way it is only too easy to obtain . . . overwhelming evidence in favor of a theory which, if approached critically, would have been refuted.”

 

 

Lysenko had publicly come out in favor of a technique of close planting of crop seeds in order to increase output. The theory was that plants of the same species would not compete with each other for nutrients. This fitted in with Marxist and Maoist ideas about organisms from the same class living in harmony rather than in competition. “With company, they grow easy,” Mao told colleagues. “When they grow together, they will be comfortable.” The Chinese leader drew up an eight-point Lysenko-inspired blueprint for the Great Leap Forward, and persecuted Western-trained scientists and geneticists with the same kind of ferocity as in the Soviet Union.

 

 

The theory of close-planting should have been put to the test. It should have been subject to possible failure. Instead it was adopted on ideological grounds. “In Southern China, a density of 1.5 million seedlings per 2.5 acres was usually the norm,” Jasper Becker writes in Hungry Ghosts: Mao’s Secret Famine. “But in 1958, peasants were ordered to plant 6.5 million per 2.5 acres.” Too late, it was discovered that the seeds did indeed compete with each other, stunting growth and damaging yields. It contributed to one of the worst disasters in Chinese history, a tragedy that even now has not been fully revealed. Historians estimate that between 20 and 43 million people died during one of the most devastating famines in human history.


Part III | CONFRONTING COMPLEXITY

 

 

The failure of companies in a free market, then, is not a defect of the system, or an unfortunate by-product of competition; rather, it is an indispensable aspect of any evolutionary process. According to one economist, 10 percent of American companies go bankrupt every year. The economist Joseph Schumpeter called this “creative destruction.”

 

 

It turns out, however, that there is a profound obstacle to testing, a barrier that prevents many of us from harnessing the upsides of the evolutionary process. It can be summarized simply, although the ramifications are surprisingly deep: we are hardwired to think that the world is simpler than it really is. And if the world is simple, why bother to conduct tests? If we already have the answers, why would we feel inclined to challenge them?

 

 

That is the power of the narrative fallacy. We are so eager to impose patterns upon what we see, so hardwired to provide explanations, that we are capable of “explaining” opposite outcomes with the same cause without noticing the inconsistency.

 

 

But think about what this means in practice. If we view the world as simple, we are going to expect to understand it without the need for testing and learning. The narrative fallacy, in effect, biases us toward top-down rather than bottom-up. We are going to trust our hunches, our existing knowledge, and the stories that we tell ourselves about the problems we face, rather than testing our assumptions, seeing their flaws, and learning.

 

 

In their book Art and Fear, David Bayles and Ted Orland tell the story of a ceramics teacher who announced on the opening day of class that he was dividing the students into two groups. Half were told that they would be graded on quantity. On the final day of term, the teacher said, he would come to class with some scales and weigh the pots they had made. They would get an “A” for 50 lbs of pots, a “B” for 40 lbs, and so on. The other half would be graded on quality. They just had to bring along their one, perfect pot. The results were emphatic: the works of highest quality were all produced by the group graded for quantity. As Bayles and Orland put it: “It seems that while the ‘quantity’ group was busily churning out piles of work — and learning from their mistakes — the ‘quality’ group had sat theorizing about perfection, and in the end had little more to show for their efforts than grandiose theories and a pile of dead clay.”


Part IV | SMALL STEPS AND GIANT LEAPS

 

 

Britain had never had a winner of the Tour de France since the race was established in 1903.

 

 

Sunspots, for example, were discovered by four scientists in four different countries in 1611.

 

 

The forerunner to the first electric battery was invented by Ewald Georg von Kleist in 1745 and Andreas Cuneus of Leyden in 1746.

 

 

Failure has many dimensions, many subtle meanings, but unless we see it in a new light, as a friend rather than a foe, it will remain woefully underexploited. Andrew Stanton, director of Finding Nemo and WALL-E, has said: “My strategy has always been: be wrong as fast as we can . . . which basically means, we’re gonna screw up, let’s just admit that. Let’s not be afraid of that. But let’s do it as fast as we can so we can get to the answer. You can’t get to adulthood before you go through puberty. I won’t get it right the first time, but I will get it wrong really soon, really quickly.”


Part V | THE BLAME GAME

 

 

According to one report by Harvard Business School, executives believe that only around 2 to 5 percent of the failures in their organizations are “truly blameworthy.” But when asked how many of these failures were treated as blameworthy, they admitted that the number was between 70 and 90 percent.

 

 

As the philosopher Karl Popper put it: “True ignorance is not the absence of knowledge, but the refusal to acquire it.”

 

 

But what the Oscar November incident reveals is that even a pioneering industry like aviation is not completely immune from the blame tendency. And perhaps it exposes, more than anything, just how far we need to travel to eradicate the blame instinct once and for all.
