Improvement potential for US civil aviation
U.S. civil airlines move humans with less risk than any mode ever devised. But there’s still room for improvement.
In the pursuit of safety, we must be careful not to price air travel out of reach, which would lower overall transportation safety by pushing travelers into their far more dangerous automobiles.
Every airline’s pilot training includes some blurb about being the “best in the business.” That’s fine for encouragement and team building, but it glosses over the immutable fact of human frailty.
Great pilots have caused crashes in mundane ways, usually through simple errors that compound in unexpected ways, sometimes with tragic results.
Distraction and complacency may be the two biggest enemies of safety, while technology, properly used, is its best defense. And it’s not great pilot skill that prevents most accidents, but rather the far more mundane practice of discipline: actually looking at items on a checklist, not accepting substandard performance on an approach, biting the bullet to write up a broken item on that last leg home. These are the things that would prevent most mishaps.
Safety improvements don’t happen at the wave of someone’s hand, even hands high in the management heap. They happen because someone champions the improvement and convinces those in power that the benefit is worth the expense. Don’t be fooled into thinking that just because something is safer it will be done; expense must be weighed. That is why the unpopular analysis of “cost per life saved” is so important: it ranks candidate safety improvements by how much safety each dollar actually buys.
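To make the arithmetic concrete, here is a minimal sketch of how a cost-per-life-saved ranking works. The intervention names and dollar figures are hypothetical, invented purely for illustration; they are not real cost-benefit data.

```python
# Hypothetical illustration of a "cost per life saved" ranking.
# Every intervention name and figure below is invented for this sketch;
# none of it is real cost-benefit data.

interventions = {
    "Upgraded approach lighting": {"cost_usd": 40_000_000, "expected_lives_saved": 8},
    "Extra simulator training": {"cost_usd": 15_000_000, "expected_lives_saved": 6},
    "New cargo-bay fire suppression": {"cost_usd": 120_000_000, "expected_lives_saved": 5},
}

def cost_per_life_saved(item: dict) -> float:
    """Cost per life saved = total program cost / expected lives saved."""
    return item["cost_usd"] / item["expected_lives_saved"]

# Cheapest way to save a life ranks first.
for name, data in sorted(interventions.items(), key=lambda kv: cost_per_life_saved(kv[1])):
    print(f"{name}: ${cost_per_life_saved(data):,.0f} per life saved")
```

In this made-up example, the training program delivers the most safety per dollar even though the fire suppression system might sound more impressive; that is the kind of comparison the analysis forces us to make.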
Our aviation system has been ingeniously tweaked over the years to overcome many human failings, but there is still room for improvement. This section is devoted to that effort.
People Don’t Fail, Processes Do
This great article addresses the myths behind brushing off accidents as “pilot error,” a conclusion as useless as saying a death was caused by “gunshot wound.” While true, it sure doesn’t shed much useful light on what we’re really interested in: who done it.
In case the article link is broken, the content is included below.
The myths of pilot error
By staff writers – May 1, 2017, Flight Safety Australia (Original Article Here)
Since the legend of Icarus, pilots have copped the blame for not following policies (i.e. don’t fly too close to the sun) and causing aircraft crashes. Over the century of powered flight, innumerable newsreaders, stories and blogs have droned ‘pilot error is suspected’ before moving on to the day’s next story, such as in the case of the recent accident at the Essendon aerodrome.
For anyone involved in aviation, the words ‘pilot error’ are, or should be, immensely aggravating. Pilot error, or controller error or engineer error are facts of life in aviation, because humans are involved in all these jobs. In fact, to blame pilots out-of-hand without evidence is also human error—it is not a useful explanation of why the accident occurred.
This is by no means advocating that aviation should not strive to minimise the errors within one’s performance. The lay-person’s perception that aviation safety may be conceived only in such black and white terms of error and an appropriate punishment is unfortunately the antithesis of the thinking of modern aviation. The well-embedded culture that error is a fact of human existence that needs to be understood, even embraced, rather than denied, is slowly creeping into more and more professions outside of aviation, such as medicine, ground transport and mining. However, even within an industry with insight and awareness, the cognitive processes that aid our thinking can at times lead to faulty conclusions due to over-reliance on heuristics and biases (Tversky & Kahneman, 1974). In fact, we’ll tackle a few in this article that seem to keep coming up in spite of our best efforts.
Myth 1: Good pilots don’t make errors
‘That myth should have been long dispelled,’ says the new CASA human factors fatigue specialist Dr Robert Forsterlee. ‘For example, research on line operations safety audits (LOSA) where pilots observe the performance of their peers during routine flights and log their errors, has shown that firstly, there are always inconsequential errors such as inputting wrong information like weights or altitudes, giving erroneous read-backs, etc. While some errors result from individual performance factors, others may be systemically based. Secondly, although the professionalism of pilots is based on accurate self-assessment, they cannot account for every mistake they make. That’s just human nature, it’s not pilots in particular.’
In his studies of mining and oil drilling, sociologist Andrew Hopkins found, like many others, that human error is universal. ‘Human failure is a fact of life and pilots make errors.’ He cites studies that show airline pilots make between one and five errors per flight sector.
Human factors researcher Matthew Thomas says, ‘Human error is now accepted as a natural part of everyday performance, and can occur both spontaneously or can be precipitated by a variety of environmental and personal factors, such as individual proficiency, workload, fatigue and team-dynamics.’
Myth 2: Error equals incompetence
If you want to make a psychologist snort, ask them about accident proneness—the idea that some people are likely to have accidents because of inherent clumsiness, risk taking or stupidity. As a concept, it is ‘folklore’, according to one review of the evidence. ‘It is not the concept of accident proneness that is being questioned … but rather the unsubstantiated claims that it is an identifiable constellation of personality traits that can be predicted using psychological tests,’ Mark Rodgers and Robert Blanchard write. Studies of accident proneness need to demonstrate large samples, stability of accident proneness over time and to account for the impact of ‘situational and circumstantial variables,’ the unique circumstances of every accident, they say.
In The Limits of Expertise, NASA Ames Research Centre psychologist R. Key Dismukes (with Ben Berman and Loukia Loukopoulos) demolishes the myth that error signals incompetence. ‘Many people assume that if an expert in some domain (aviation, medicine or any other) makes an error, this is evidence of lack of skill, vigilance or conscientiousness. This assumption is both simplistic and wrong,’ the authors write.
‘Skill, vigilance and conscientiousness are of course essential for safe, effective performance, but are not sufficient. A particularly problematic misconception about the nature of skilled human performance is that if experts can normally perform some task without difficulty then they should always be able to perform that task correctly. But in fact experts in all domains from time to time make inadvertent errors in tasks they normally perform without difficulty.’
Dismukes and his colleagues say human error is the consequence of the interaction of subtle variations in task demands, incomplete information available to the expert performing the task, and ‘the inherent nature of the cognitive processes that enable skilled performance’.
They also emphasise that ‘skill and error spring from the same perceptual and cognitive processes,’ acknowledging James Reason’s work in elaborating this insight.
To err, in other words, is human. And because experts are human, they err too. Forsterlee agrees that ‘the best people make the worst mistakes,’ and cites the case of Jacob Veldhuyzen van Zanten, the captain of KLM flight 4805, which was destroyed in aviation’s worst disaster, the Tenerife runway collision that killed 583 people. The captain was head of KLM’s check and training department, and the airline had at first called him to investigate the disaster before realising he was involved in it.
Seniority and competence add a dangerous weight to errors, Forsterlee says. ‘It’s when the gurus make mistakes that may be the most damaging,’ he says. ‘When a guru makes a mistake, even at times when people can see it, they don’t believe it or they think it’s not a mistake in this case.’
Myth 3: Errors explain accidents
‘To think in terms of error and blame is itself an error,’ Andrew Hopkins says. ‘The concept of blame is counterproductive from an explanatory point of view. We have two paradigms of thinking; the blame paradigm and the explanatory paradigm and these two things are incommensurable. They don’t meet in any way.’
Dismukes and colleagues say aviation accident investigation must go beyond the safe conclusion of crew error and look at the circumstances that were associated with the error. ‘It is inconsistent with the fundamental realities of human performance, and counterproductive to safety, to focus too narrowly on crew errors or to assign the primary causes of the accident simply to those errors. Rather, almost all accidents involve vulnerabilities in a complex sociotechnical operating system, and causality lies in the probabilistic influence of many factors, of which pilot error is only one. The object is to design the operating system to be resilient to the equipment failures, unexpected events and human errors that inevitably occur.’
Hopkins says a simple but useful way to get past a simplistic focus on error is the ‘five whys’ method adopted by Toyota as part of its distinctive production system. The pioneer of that system, Taiichi Ohno, coined the saying, ‘Ask “why” five times about every matter.’
When confronted with an oil spill on a showroom floor, the five whys method would ask:
Why is there a puddle of oil? Because the engine is leaking oil.
Why is the engine leaking? Because the gasket has failed.
Why has the gasket failed? Because it’s a low-quality item.
Why are we using low-quality items? Because the purchasing agent is being remunerated for cost savings.
Why is the purchasing agent being remunerated for cost savings? Because company policy values saving money more than quality.
‘The Toyota production system says, “People don’t fail, processes do,” and using it almost always brings you to a process that can be improved,’ Hopkins says.
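Below is a minimal sketch of that chain of questions expressed as data and a loop. The wording mirrors the oil-spill example above; the code is illustrative only and is not a tool from the Toyota production system or from Hopkins.

```python
# A minimal sketch of a five-whys chain, reusing the oil-spill example above.
# Each key is a finding; its value is the immediate cause uncovered by asking "why?".
# The answer reached after (up to) five whys is treated as the process-level root cause.

causes = {
    "there is a puddle of oil on the floor": "the engine is leaking oil",
    "the engine is leaking oil": "the gasket has failed",
    "the gasket has failed": "a low-quality gasket was fitted",
    "a low-quality gasket was fitted": "the purchasing agent is rewarded for cost savings",
    "the purchasing agent is rewarded for cost savings":
        "company policy values saving money more than quality",
}

def five_whys(problem: str, causes: dict[str, str], depth: int = 5) -> str:
    """Follow the chain of immediate causes up to `depth` times; return the last one found."""
    current = problem
    for i in range(depth):
        if current not in causes:
            break  # chain of known causes ran out before five whys
        answer = causes[current]
        print(f"Why #{i + 1}: why is it that {current}? Because {answer}.")
        current = answer
    return current

root = five_whys("there is a puddle of oil on the floor", causes)
print(f"Root cause (a process, not a person): {root}")
```

The point of the structure is that the chain deliberately terminates at a process or policy, not at the person nearest the puddle.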
Myth 4: I wouldn’t make that error
‘With the benefit of hindsight it’s easy to say no way would I have done such a thing,’ Forsterlee says. ‘But you weren’t in that situation, on that day, with that information.’
There are a number of contributors to this mindset, such as: optimism bias, the misperception that one is less likely than others to incur a negative consequence; over-estimating skill and under-estimating risk; and the illusion of control, which occurs when people trying to obtain a desired outcome that actually occurs independently of their behaviour come to believe that they are controlling it.
Understanding that human error, including your own, is inevitable is the beginning of improvement, he says. ‘All champions lose games. But they’re champions because they learn from their mistakes, and their losses.’
‘Everybody should be aware of the fallibility of humans. But there are always the personality types who, in the face of all the factual external evidence on the planet, say “I have an alternative explanation that proves I am not wrong.” That’s why highlighting susceptibility to belief in these myths is so important.’
Self-awareness and the ability to improve have little to do with age or experience. Professionalism is the continued self-scrutiny to ensure you continue to develop; this is not possible without the ability to accept feedback. Unfortunately, there are people who externalise their lack of achievement and professional growth, blaming circumstances or others rather than their own efforts.
Dismukes and his colleagues found the notion that the crews of doomed aircraft were unusually or grievously deficient does not withstand examination. ‘For the most part, crew performance during these accident flights, to the extent it can be determined from the NTSB reports, seems quite similar to that of the many crews we and others have observed in actual flight observations and in recurrent flight training simulations,’ the authors write.
‘Even the few accidents that involved deviation from explicit guidance or standard operating procedures should be interpreted cautiously; one should consider the actual norms for operating on the line—is the accident crew’s deviation unique or do similar deviations occur occasionally, or even frequently in similar conditions?’
Dismukes also has some sobering thoughts on the role of luck in crashes. ‘I find it ironic that in some windshear accidents the crew was faulted for continuing an approach even though an aircraft landed without mishap one minute ahead of the accident aircraft. Both crews had the same information, both made the same decision, but for one crew luck ran the wrong way.’
Interruption and errors: remembering to do what’s right
On 20 August 2008, a McDonnell Douglas MD-82 airliner lined up to take off from Barcelona El Prat airport, then returned to the gate to attend to a high-reading engine nacelle sensor. An hour later, it took off again, only to roll right immediately on leaving the ground and crash, killing 154 of the 172 on board. The crew of Spanair flight 5022 had forgotten to extend the flaps for take-off, as the crew of Northwest Airlines flight 255 had done almost exactly 21 years earlier, in another MD-82, on 16 August 1987. Both crews had been interrupted in their take-off preparations and both aircraft had inoperative take-off configuration warnings.
Memory and error were specialties of Dismukes, who recently retired from NASA after a distinguished career investigating the psychology of mistakes in aviation. The psychologist and former airline transport-rated pilot now has time to enjoy his hobby of gliding. He recently put his academic knowledge into an admirably lucid article for his local gliding club newsletter.
‘Without going into technical detail I will give you a high-level summary of the conclusions we reached, drawing also on the findings of many other research teams.
‘When we looked at airline accidents in which pilots forgot to perform some essential task, we found that most of the things they forgot were routine operational procedural steps, rather than unique actions they were trying to remember to do. These were fairly simple tasks that they had executed thousands of times without difficulty on previous flights, things such as setting flaps, extending the landing gear, turning on hydraulic boost pumps, or reading items on a checklist. What I’m talking about is a relatively new field of cognitive science called prospective memory: remembering to do things we intend to but must defer until a later time.
‘It quickly became clear that everyone occasionally forgets to do things, no matter how simple the task is, no matter how often they have performed the task previously, no matter how skilful they are at their work, no matter how conscientious they are. So why do we forget and what can we do to prevent these memory lapses?
‘Humans have a vast store of information, almost unlimited, in long-term memory. That’s where everything you know resides. But at any one moment only a tiny sliver of that information is activated and immediately available to conscious manipulation in what is called working memory.
‘The research shows that to retrieve the deferred intention back into awareness at the right time, we must notice something relevant; some cue must remind us, a cue that is in some way related to the intention.’
Priming Memory: Error Countermeasures
For situations where you have to defer things that need to be done, Dismukes recommends creating reminder cues that will be noticed at the right time. ‘The best reminder cues are distinctive, salient, unusual, and/or block further action, such as putting an empty Styrofoam cup over the throttles of an aeroplane to remind you that there is something you must do before you take off.’
Reminder cues include:
- Writing a note to yourself and leaving it where it will be seen.
- Forming implementation plans. ‘Explicitly identify where and when you intend to perform the deferred task; mentally note exactly where you will be and what you will be doing when you intend to perform that task.’
- Visualise yourself performing the task.
- Execute your checklist in a slow, deliberate manner, and point to each item checked.
- If using your own checklist, keep it as short as possible and put the killer items at the top.
- Make sure the print is large and the checklist is easily accessible.
- Whenever you are interrupted—and you will be—pause before addressing the interruption to form an implementation plan.
- If your checklist is interrupted, hold it in your hand instead of putting it down.
- Enlist other people to help you remember.
Dismukes says multitasking, performing a procedural step out of sequence, or substituting an atypical procedural step should be treated as red flags for likely error. ‘Form an implementation plan and create reminder cues,’ he says. ‘Above all, avoid rushing, regardless of time pressure. Rushing at best saves a few seconds, and it increases our vulnerability to these and other types of errors enormously.’
Further information:
Dekker, S. (2004).
Ten Questions About Human Error: A New View of Human Factors and System Safety. CRC Press, USA.
Dekker, S. (2014).
The Field Guide to Understanding Human Error. CRC Press, USA.
Dismukes, R. K. (2001).
Rethinking Crew Error: Overview of a Panel Session. Research and Technology Report. Moffett Field, CA: NASA Ames Research Center.
Dismukes, R. K., Berman, B. A., & Loukopoulos, L. D. (2007).
The Limits of Expertise: Rethinking Pilot Error and the Causes of Airline Accidents. Ashgate Publishing Ltd, USA.
Helmreich, R. L., Klinect, J. R., & Wilhelm, J. A. (May 1999).
Models of Threat, Error, and CRM in Flight Operations. In Proceedings of the 10th International Symposium on Aviation Psychology (pp. 677–682).
Ohno, T. (March 2006).
‘Ask “why” five times about every matter.’ Toyota Traditions.
Reason, J. (1990).
Human Error. Cambridge University Press, UK.
Rodgers, M. D., & Blanchard, R. E. (1993).
Accident Proneness: A Research Review. Federal Aviation Administration, Office of Aviation Medicine, Washington, DC.
Thomas, M. J., Petrilli, R. M., & Dawson, D. (2004).
An Exploratory Study of Error Detection Processes During Normal Line Operations. Doctoral dissertation, European Association for Aviation Psychology.
Tversky, A., & Kahneman, D. (1974).
Judgment under Uncertainty: Heuristics and Biases. Science, 185, 1124–1131.