The Weekly Reflektion 33/2022
An organisation continually receives information that is relevant to its operation and that can be used to make improvements. This, of course, requires proper interpretation of that information and the avoidance of unsubstantiated assumptions.
Pic. 1 A steel helmet used by British soldiers in World War I.
Pic. 2 Distribution of bullet holes on Allied aircraft.
How do you use available information to make improvements in your organisation?
At the beginning of World War I, British soldiers were issued with cloth caps. When steel helmets were introduced, the number of wounded soldiers in hospital increased dramatically. This led to speculation among the army's senior management that steel helmets might not be having the desired effect, and consideration was given to returning to cloth caps. An alternative explanation, also seemingly obvious, was that the soldiers being admitted to hospital with head wounds would have died had they not been wearing steel helmets.
During World War II, combat aircraft would often return with bullet holes from enemy fire. The Allies initially focussed on strengthening the most damaged areas to increase survivability under fire. The figure above shows the distribution of bullet holes on returning aircraft. A mathematician, Abraham Wald, pointed out an alternative way of interpreting the available data: the aircraft hit in the areas showing no bullet holes were the ones that did not return from combat. This seemingly obvious conclusion was acted on and led to armour being reinforced in the areas where there were no bullet holes. This was particularly beneficial for the Douglas A-1 Skyraider, introduced in 1946 and remaining in service into the early 1970s despite being a piston-engined, propeller-driven aircraft.
In both these examples the term ‘seemingly obvious’ is used. In each case the alternative conclusion became obvious only once it was pointed out by someone with the imagination to consider the information in another way. ‘Seemingly obvious’ is often a result of hindsight bias: we see it, and see the logic of it, only when it is revealed by others or when we see the consequences.
An organisation continually receives information from its operation that could be relevant to making improvements and to preventing incidents and accidents. Sometimes the information is clear and concise, and relatively easy to interpret and act on. This type of information is often referred to as ‘strong signals’. We feel confident about acting on strong signals since, by definition, they are not so open to interpretation. Sometimes the information is vague or ambiguous, requires time and effort to interpret, and may of course be open to different interpretations. This type of information is often referred to as ‘weak signals’. We often don’t feel so confident about reacting to weak signals because we are unsure what the signal is an indicator of, and hence whether any action would actually have the desired effect.
There are two concerns here. The first is incorrect interpretation of information that seems to indicate a strong signal: actions are taken on this basis and turn out to be wrong, sometimes with serious consequences. The second is failure to respond to information that is considered a weak signal: the inherent uncertainty in the information often leads to no action being taken at all.