
Organizational Accidents in Aviation
Jim Reason
University of Manchester
Two kinds of accidents

Individual accidents:
• Frequent
• Limited consequences
• Few or no defences
• Limited causes
• Slips, trips and lapses
• Short ‘history’

Organizational accidents:
• Rare
• Widespread consequences
• Many defences
• Multiple causes
• Product of new technology
• Long ‘history’
Two ways of looking at accidents
• The person approach: Focuses on the errors
and violations of individuals. Remedial
efforts directed at people at the ‘sharp end’.
• The system approach: Traces the causal
factors back into the system as a whole.
Remedial efforts directed at situations and
organisations.
The person approach

• Since human actions are implicated in 80-90 per cent of accidents
• And human actions are perceived as under
voluntary control
• And behaviour is viewed as being the least
constrained factor in the prior events
• Then accidents must be due to carelessness,
negligence, incompetence, recklessness, etc.
Although . . .
• Blaming individuals is emotionally
satisfying and legally convenient
• It gets us nowhere
• Fallibility is part of the human condition
• You can’t change the human condition
• But you can change the conditions in which
humans work
The system approach

• Accidents arise from a (usually rare) linked sequence of failures in the many defences, safeguards, barriers and controls established to protect against known hazards
• The important questions are:
– How and why did the defences fail?
– What can we do to reduce the chances of a
recurrence?
Hazards, losses & defences

[Diagram: hazards are separated from losses by successive layers of ‘hard’ and ‘soft’ defences]
The Swiss cheese model of accident causation

[Diagram: successive layers of defences, barriers and safeguards stand between hazards and losses; each layer has holes, some due to active failures, others due to latent conditions (resident ‘pathogens’); losses occur when the holes line up]
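The model’s arithmetic intuition can be sketched in a few lines of code. This is a minimal illustration, not part of the original slides: the layer failure probabilities are invented, and treating the layers as independent is a deliberate simplification.

    # Minimal sketch of the Swiss cheese intuition: a loss requires the
    # holes in every defensive layer to line up at once. Probabilities
    # are invented for illustration; independence is a simplification.
    def p_loss(layer_failure_probs):
        p = 1.0
        for p_fail in layer_failure_probs:
            p *= p_fail          # all layers must fail together
        return p

    # Four hypothetical layers, e.g. interlocks, alarms, SOPs, oversight.
    print(p_loss([0.01, 0.05, 0.10, 0.20]))   # ~1e-05: losses are rare
    # A latent condition is a hole that never closes (probability 1.0):
    print(p_loss([1.0, 0.05, 0.10, 0.20]))    # ~1e-03: 100x more likely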
How and why defences fail
[Diagram: the causal sequence runs from organisational factors, through local workplace factors, to unsafe acts that breach the defences between hazards and losses; latent condition pathways lead directly from organisational factors to the defences; investigation traces the same pathways in reverse, from losses back to causes]
Types of unsafe act
• Errors
– Slips, lapses and fumbles
– Mistakes
• Rule-based mistakes
• Knowledge-based mistakes
• Violations
– Routine violations
– Violations for kicks
– Situational violations
Two influential accidents
• Mount Erebus, 1979: one accident, two
inquiries:
– Chippindale Report (‘pilot error’)
– Mahon Report (‘orchestrated litany of lies’)
• Dryden, 1989: Moshansky Report, an
indictment of the entire Canadian air
transport system.
Three aviation applications
of the ‘Swiss cheese’ model
• Bureau of Air Safety Investigation (BASI),
Canberra.
• International Civil Aviation Organization
(ICAO): Amendment to Annex 13, the
guide to accident investigators.
• As applied by an airline to a recent air
accident in North America.
BASI
• In the early 1990s, BASI resolved to apply the model to all accident investigations.
• In June 1993, a small commuter aircraft crashed at Young, NSW. All aboard died.
• The BASI report focused on the deficiencies of the
regulator, the Oz CAA.
• Following a similar accident in 1994, the Oz CAA
was disbanded. Replaced by CASA.
ICAO Accident Investigation
Divisional Meeting (2/92)
Traditionally, investigations have been limited to
the persons directly involved. Current accident
prevention views supported the notion that additional
preventive measures could be derived from
investigations if management policies and
organisational factors were also investigated.
(excerpt from minutes)

Implemented in the 8th edition of Annex 13 (1994)


The ‘Harrytown’ accident
• A modern 50-passenger glass cockpit jet aircraft hit the runway wing down on landing, slid off, and ended up in trees 2,000 ft away. Nine people injured.
• F/O flying. Temp -8°C. Visibility 1/8 mile. Fog, snow, no wind. 23.47 hrs.
• Both pilots were seriously surprised.
HARRYTOWN: SPECULATIVE EVENT TREE

[Event tree diagram. ‘+’ marks AND gates (each input necessary, none sufficient); arrows mark causal pathways. The top event, ‘wing hits ground’, results from the combination of a stall, too low an airspeed and too low an altitude, fed by two clusters of factors:
• Aircraft handling features (AND gate): prone to inducing ‘ground shyness’; stick pusher problems?; prone to stall wing down?; no leading edge flaps?; leading edge contamination?; nose-up response to power increase?
• Operational factors (AND gate): poor visibility (‘white hole’); inexperienced copilot (jet) assigned landing (SOPs?); inappropriate aircraft attitude at 100 ft and below; inadequate monitoring by Captain (SOPs?); delayed go-around order. Sufficient?]
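The defining property of the ‘+’ AND gates is that every input is necessary and none is sufficient, so removing any single contributing factor blocks the whole pathway to the top event. A minimal sketch of that logic, with the tree shape simplified and factor names abbreviated for illustration:

    # AND-gate event tree: a gate fires only if every input is present,
    # so eliminating any one factor defeats the entire pathway.
    class AndGate:
        def __init__(self, name, inputs):
            self.name = name
            self.inputs = inputs     # factor strings or nested gates

        def fires(self, present):
            return all(
                i.fires(present) if isinstance(i, AndGate) else i in present
                for i in self.inputs
            )

    stall = AndGate("stall", ["handling features", "too slow"])
    top = AndGate("wing hits ground", [stall, "too low", "delayed go-around"])

    factors = {"handling features", "too slow", "too low", "delayed go-around"}
    print(top.fires(factors))                  # True: all holes lined up
    print(top.fires(factors - {"too slow"}))   # False: one factor removed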
Implications of event tree - 1
• Two main clusters of contributing factors:
– those relating to the aircraft
– those relating to handling and flight operations
• Two main pathways for back-tracking:
– to the manufacturer (not our immediate
concern)
– to flight operations and the system as a whole
(the priority pathway)
Implications of event tree - 2
• The Harrytown accident involved the
combination of several contributing factors
that were very hard to anticipate.
• The local circumstances were such that it
took very little in the way of less-than-
adequate pilot performance to push the
system over the edge.
• This was an ‘organizational accident’.
Pruned event tree
[Pruned event tree, mapping each remaining contributing factor to the ORGANIZATIONAL ISSUES behind it:
Aircraft factors (AND gate):
• Prone to inducing ‘ground shyness’ → TRAINING
• Nose-up response to power increase → TRAINING
Operational factors (AND gate):
• Poor visibility → TRAINING, SOPs
• Inexperienced co-pilot (jet) assigned landing → SOPs, HIRING, etc.
• Problems with aircraft attitude at 100 ft and below → TRAINING, CHECKING
• Monitoring and cross-checking problems → TRAINING, HIRING, SOPs
• Delayed go-around order → TRAINING, SOPs]
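One way to read the pruned tree is to tally how often each organizational issue recurs across the contributing factors; the recurring ones become the review priorities on the next slide. A minimal sketch using the mapping above (factor labels abbreviated; the tally itself is illustrative, not from the original investigation):

    # Count how often each organizational issue recurs across the
    # pruned tree's factors (mapping taken from the slide above).
    from collections import Counter

    issues_by_factor = {
        "ground shyness":            ["TRAINING"],
        "nose-up power response":    ["TRAINING"],
        "poor visibility":           ["TRAINING", "SOPs"],
        "inexperienced co-pilot":    ["SOPs", "HIRING"],
        "attitude below 100 ft":     ["TRAINING", "CHECKING"],
        "monitoring/cross-checking": ["TRAINING", "HIRING", "SOPs"],
        "delayed go-around":         ["TRAINING", "SOPs"],
    }
    counts = Counter(i for v in issues_by_factor.values() for i in v)
    print(counts.most_common())
    # [('TRAINING', 6), ('SOPs', 4), ('HIRING', 2), ('CHECKING', 1)]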
Defences that failed in the Harrytown accident

• Stall protection system
• Airmanship
• Training, checking, SOPs
• Hiring, placement, contracts, exposure to safe culture
Key issues for review

• Operating procedures
• Training
• Checking
• Hiring and placement of pilots
• Assimilation of new hires into
airline culture
Comments on the culture

• A strong culture: Embodied in a few widely known and well-understood beliefs and values.
• A safe culture: Values solidity, reliability, accuracy; proud to be ‘dull’ in the pursuit of quality and safety.
• A collective culture: No one person is indispensable; people work as interchangeable units.
The moral
• No point replacing ‘pilot error’ attribution
with ‘management error’.
• All top level decisions, even sound
commercial ones, have a downside for
someone, somewhere in the system at some
time. All create resident pathogens.
• Challenge: to identify and rectify latent
conditions before they combine to cause
accidents.
Accident/incident questions
• What defences failed?
• How did they fail?
• Why did they fail?
– Unsafe acts?
– Team factors?
– Workplace factors?
– Technical factors?
– Organizational factors?
