The Weekly Reflektion Week 50 / 2019

This week’s Reflektion is a follow-up to the week 49 Reflektion on human error and considers the concept of mental models.

Do you have ‘human error’ down as one of the main causes for accidents and incidents?

What mental model did the person(s) involved in the accident or incident have and how did this influence the decisions made and actions taken?

During start-up of the isomerization unit at BP’s Texas City oil refinery, the Central Control Room (CCR) operator monitored the level transmitter in the raffinate splitter tower as raffinate feed (hydrocarbon) was introduced. The level rose above the high-level alarm setting and an alarm from the level transmitter sounded in the CCR. As filling continued, the level transmitter indicated about 10 ft. and the level appeared to be falling. A separate high-level switch was not activated during the filling process. It was normal practice to overfill the splitter on start-up, although the procedures stated otherwise. The discharge valve from the raffinate splitter had been closed by the previous shift, and this had not been noted in the CCR logbook. The splitter was eventually overfilled, the pressure relief valves lifted and the raffinate was discharged to a blowdown drum. The blowdown drum had an open vent and a liquid drain and was not designed for the developing scenario. Hot raffinate emerged from the blowdown drum and formed a gas cloud that was ignited by a diesel pick-up truck parked near the drum with its engine running. 15 people were killed and 180 injured.

In the Reflektion in week 49 we talked about Causal Learning and the premise that people follow their convictions when taking actions and do not intentionally do something wrong. People’s convictions are part of what forms their beliefs. Belief in this context can be related to ‘mental models’ or ‘paradigms’. Peter Senge in his book ‘The Fifth Discipline’ describes mental models as ‘deeply ingrained assumptions, generalizations, or even pictures or images that influence how we understand the world and how we take action’. Mental models are frameworks consisting of our underlying assumptions from socialization, values, beliefs, education, and experience that help us organize information. Put simply, our mental models dictate how we understand our world. The CCR operator at Texas City believed that what he was doing was OK. The information that he had corresponded to his mental model of a normal start-up. He had no reason to question any of the available information or to stop the start-up process.

In the Reflektion in week 49, we also talked about the process of learning in Causal Learning. Learning is not just the acquisition of new knowledge but also the creation of new beliefs and convictions. Therefore, if we want to learn then we need to change our mental models. We need to change the way we understand the world.

Peter Senge developed the ‘Ladder of Inference’ (first proposed by Chris Argyris) in ‘The Fifth Discipline Fieldbook’ to describe the processes behind the decisions we make and the actions we take.

The steps on the ladder from the bottom are:

1. Observe data and experiences
2. Select data from what we observe
3. Add meanings (cultural and personal)
4. Make assumptions based on the meanings we added
5. Draw conclusions
6. Adopt beliefs about the world
7. Take actions based on those beliefs

When an incident occurs, we investigate, and we often dive into the actions taken, identifying the things that should have been done, the things that were missing and the things that were not done according to the procedures. The measures we introduce to prevent recurrence are based on this approach. We do not try to understand why the people involved thought it was OK to do what they did. We do not try to understand the mental model of the world that they were using to rationalise what they did. Without this understanding, how can we influence the mental models of the people involved so that their decisions and actions are right next time?

All very well, Reflekt, but this seems a bit theoretical. Do you have any practical suggestions for how to do what you are suggesting? Tune in next week.

Reflekt AS