This paper examines the impact of cognitive bias, the missing element in fatality prevention. Cognitive biases lead us to take mental shortcuts, and those shortcuts cut two ways: they simplify decision-making, but they can also lead to errors, and even fatalities. Any time people must make judgments in complex or ambiguous situations, there is a risk that cognitive bias will undermine effectiveness and compromise safety and lives.
After defining cognitive bias, the paper presents a manufacturing safety example that illustrates the many ways cognitive bias can affect decisions. A section following the example then shows how to put knowledge of cognitive bias to work for you instead of against you.
Every day, organizational leaders make decisions under a variety of pressures. They must satisfy different constituents and meet budgets and schedules, while also managing the risks inherent in what we call the Working Interface, the relationship between people and systems. Unfortunately, decisions made under the pressure of meeting organizational objectives often turn out to be dead wrong.
Too often, it is only after an investigation has been completed that we consider what alternative courses could have been taken, or what other decisions could have been made, that would have prevented an accident or fatality. We then realize that our judgments were incorrect, and unnecessarily so: decisions were based on flawed judgments, usually concerning our assessment of future probabilities. No leader likes to discover that he or she has made flawed judgments.
Consider some common questions that leaders must ask themselves when making decisions. How likely is it that foam falling off the fuel tank will puncture the wing of the Space Shuttle? Are lockout/tagout procedures adequate to ensure workers will not contact energized equipment? Are the systems designed to control runaway chemical reactions sufficient to avoid major incidents?
Underlying each of these questions is the need for accurate judgments about future probabilities, judgments made under uncertainty. How are these judgments made? Do we rely on past experience ("we've always done it that way and nothing has happened so far"), or do we require rigorous analytic methods to demonstrate that a course of action is safe? What kind of information does it take to prompt us to act? After an accident, it seems clear what should have been decided. But if we look carefully at what we knew before the accident, too often we had all the information we needed to make a safe decision, yet we did not pay attention to it. Every leader might be asking, "Why not?"
Cognitive psychology offers a rich scientific literature on cognitive bias that can help answer these questions. It turns out that human beings tend to make inaccurate judgments about future probabilities in predictable ways. These tendencies toward faulty judgment are called "cognitive biases," and we know quite a bit about them.