The Weekly Reflektion 41/2021

As systems become more complicated and control systems more sophisticated, the Human-Machine Interface (HMI) becomes increasingly important.

The wreckage of the Airbus A300B4-622R that crashed at Nagoya Airport

Do you have good control of HMI in your operation?

China Airlines flight 140, flying from Taiwan, was making its approach to Nagoya Airport in Japan on 26th April 1994. The first officer was flying the aircraft and he was experiencing problems pushing the nose down to land the plane. The captain took over control, but he also could not push the nose down. He decided to ‘go around’ and pulled back on the controls. The aircraft went into a steep climb and the captain could not recover the situation before the aircraft stalled and then crashed. 264 of the 271 people on board died.

The aircraft was an Airbus A300B4-622R that used the latest fly-by-wire technology, including automatic actions by the flight computer. The first officer was inexperienced, and the captain, although an experienced Boeing 747 pilot, was relatively inexperienced on the Airbus. During the approach the first officer inadvertently triggered the ‘go around’ mode. This was shown on the control panel; however, the display was not clear and no audible alarm was given. The flight computer initiated a go-around automatically and moved the horizontal stabiliser to its full nose-up position. The first officer, who still intended to land the aircraft, tried to push the nose down; however, the autopilot was not designed to disengage during a go-around if the pilot tried to override its commands. When the captain took over control and realised that he could not land, he initiated a manual ‘go-around’. The manual actions to lift the nose added to the automatic actions, and the plane climbed and stalled.

Following earlier incidents, a software modification was available before the Nagoya crash that would have allowed pilot pressure on the control column to disengage the autopilot. Taiwan’s Civil Aeronautics Administration (CAA) claimed that Airbus had failed to notify its customers of the necessary modification.
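The mode-dependent disengage logic at the heart of the accident can be illustrated with a small sketch. This is a toy model, not Airbus software: the class, the force threshold, and the mode names are all hypothetical, chosen only to show how a pilot override that works in one mode can be silently ignored in another, and how the software modification changes that behaviour.

```python
from enum import Enum, auto

class Mode(Enum):
    APPROACH = auto()
    GO_AROUND = auto()

class Autopilot:
    """Toy model of autopilot disengage logic (illustrative only)."""
    # Hypothetical threshold: column force (newtons) treated as a
    # deliberate pilot override.
    OVERRIDE_FORCE_N = 15.0

    def __init__(self, disengage_on_override_in_go_around: bool):
        # The flag models the software modification described above.
        self.disengage_on_override_in_go_around = disengage_on_override_in_go_around
        self.engaged = True
        self.mode = Mode.GO_AROUND  # inadvertently triggered, as on flight 140

    def apply_column_force(self, force_n: float) -> bool:
        """Return True if the autopilot disengages in response to pilot input."""
        if not self.engaged or force_n < self.OVERRIDE_FORCE_N:
            return False
        if self.mode is Mode.GO_AROUND and not self.disengage_on_override_in_go_around:
            # Pre-modification logic: pilot pressure in go-around mode is
            # ignored, so pilot and autopilot fight each other.
            return False
        self.engaged = False
        return True

# Before the modification: pushing the nose down does nothing.
original = Autopilot(disengage_on_override_in_go_around=False)
print(original.apply_column_force(30.0))  # False: autopilot stays engaged

# After the modification: the same input disengages the autopilot.
modified = Autopilot(disengage_on_override_in_go_around=True)
print(modified.apply_column_force(30.0))  # True: control returned to the pilot
```

The sketch shows why the crew’s mental model failed: the same physical action produced opposite results depending on a mode the display did not clearly communicate.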

In addition to the needed software change, the investigators pointed to the absence of any cockpit warning system to indicate that the aircraft was in the wrong mode. The flight manual was also criticised for unclear instructions on how to correct such a mistake. The investigation report further pointed to the crew’s lack of familiarity with the systems and to poor pilot/co-pilot communication.

In our Reflektion in week 19 of 2021 we told the story of Loganair flight BE-6780, where the crew also fought against the autopilot and were only seconds from disaster when it disengaged and they recovered control. The people on board China Airlines flight 140 were not so lucky.

The interface between the control system and the human is a fragile area that needs careful and constant attention. Often the people designing and building the systems that control equipment make assumptions about how the people operating the equipment will respond in certain situations. Sometimes the designers do not even think about the operator who will be using the equipment. The Human-Machine Interface (HMI) then becomes an interface between the mental models of the designer and the mental models of the operator. If these mental models are not compatible, there is an accident waiting to happen. To make them compatible, get the designers and the operators talking to each other.

Reflekt AS