It's a common theme: A senior leader makes a safety-related decision and expects a specific outcome, or at least progression toward a goal, but his directive flies past the workforce without making a dent in employee behavior. For the leader, this is inexplicable and frustrating. He doesn't understand why employees don't appreciate his efforts to improve their safety. But from the employees' perspective, the leader's decision only reinforces the belief that he doesn't truly value their safety—after all, why would his decisions undermine a shared goal if he really cared?
This is a problem of misalignment. The leader's decision-making strategy, while well-meaning, can work at cross-purposes to his intentions and objectives, sending the wrong message to employees. The leader may believe his decisions will advance safety when in fact they are undercutting progress and disheartening individuals. This paper explores how misalignment develops and offers corrective mechanisms that refine the leader's situational decision-making practices to better support safety.
Situational decision making is a skill that leaders must master if they want to restore alignment between intention and action, send a consistent message to the workforce, and create behavioral reliability on the shop floor.
No leader wants his decisions to run counter to his goals or make workers feel he doesn't care about their safety. Unfortunately, this happens all the time. When it comes to safety, what we don't know can hurt us and others—literally.
Many safety-related decisions require leaders to make accurate judgments about future likelihoods. After an undesirable outcome, it often seems clear what should have been decided. When we look carefully at what we knew before the event, we often find that we had all the information needed to make a safety-supporting decision, but we didn't pay attention to it. Why not? A rich scientific literature in cognitive psychology helps explain this seemingly irrational phenomenon: human beings tend to make inaccurate judgments about future probabilities in predictable ways. These tendencies toward faulty judgment are called cognitive biases, and they can lead to serious safety problems.