Human error is cited as a contributing factor in the majority (up to 80%) of industrial accidents and incidents. The Human Factors Analysis and Classification System (HFACS) was adapted for use in the Oil & Gas (O&G) industry to provide a common framework for systematically classifying accident contributing factors. HFACS originates from the aviation industry and is based on Reason's model of latent and active failures, the "Swiss cheese" model. The framework defines the "holes in the cheese" with four main categories of conditions/failures: Acts, Preconditions, Supervision, and Organizational Influences. Each category contains a set of failure codes, known as nanocodes.
The O&G HFACS can be applied to accidents on fixed or mobile installations for production or drilling. Integrated Operations (IO), or e-fields, have introduced new ways of working and allow for the formation of virtual organizations. O&G HFACS can be used to balance the Man-Technology-Organization (MTO) development in this context.
Nanocodes from the US Department of Defense and the US Naval Safety Center were merged and adapted to form the O&G HFACS. The review of the merged framework resulted in 23 changed, 61 removed, 18 added, and 144 unchanged nanocodes, leaving O&G HFACS with 185 nanocodes. Two important new aspects were the use of personnel with different cultural backgrounds and personnel's understanding of the language used for communication and documentation/procedures. The O&G HFACS was applied for validation purposes to four Norwegian offshore accidents that occurred during 2007. The framework was suitable for these accidents and revealed that latent failures on the organizational level were most prevalent, in particular oversight and procedures.
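The nanocode tallies above can be cross-checked with a short bookkeeping sketch. It assumes, as the text implies, that the changed, unchanged, and removed nanocodes partition the merged DoD/Naval Safety Center set, and that the final O&G HFACS consists of the retained codes plus the added ones; the variable names are illustrative, not part of the framework.

```python
# Nanocode counts as reported in the review of the merged framework.
changed = 23    # retained but modified
removed = 61    # dropped from the merged set
added = 18      # new codes introduced for the O&G context
unchanged = 144 # retained as-is

# Size of the merged DoD/Naval framework before the review:
# everything retained (changed or unchanged) plus everything removed.
merged_total = unchanged + changed + removed  # 228

# Size of the resulting O&G HFACS: retained codes plus new additions.
oag_total = unchanged + changed + added  # 185, matching the text

print(merged_total, oag_total)
```

The check confirms the internal consistency of the reported figures: 144 + 23 + 18 = 185, the stated number of O&G HFACS nanocodes.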
O&G HFACS provides a common framework for comparison of accidents and incidents on many levels, e.g. within the industry, a company, or an installation. The framework can be useful for systematizing accident information retrieved through an MTO approach.
There is no unified definition of "human error" in the existing literature. Human error can broadly be defined as "…all those occasions in which a planned sequence of mental or physical activities fails to achieve its intended outcome, and when these failures cannot be attributed to the intervention of some change agency" [1]. This definition does not distinguish between slips, lapses, and mistakes, but defines error generically. Human Factors (HF) can be defined as "the scientific discipline concerned with the understanding of interactions among human and other elements of a system, and the profession that applies theory, principles, data and methods to design in order to optimize human well-being and overall system performance" [2].
In aviation, typically 60–80 percent of all accidents are attributed to human error [3], and studies in the offshore and maritime industries show that 80 percent of accidents and incidents involve human error [4]. The US Coast Guard refers to the human element as a causal factor in 80–90 percent of all mishaps [5]. However, the operator represents a convenient stopping point in an event-chain model of accidents [6], and hence these numbers may be biased. Leveson [6] lists four reasons for such operator error statistics:
only operator errors that negatively affect safety are reported,
unrealistic expectations that the operator can overcome any emergency,
operators may have to intervene in unforeseen situations at the limits of the system, and
hindsight often allows us to identify better decisions in retrospect.