The Weekly Reflektion 18/2026
Complacency is a feeling of calm self-satisfaction accompanied by an unawareness of actual dangers, deficiencies, or the need to improve. It is often characterized by a smug, uncritical contentment with the status quo, frequently leading to poor decision-making or failure to strive for better results. There is a road to complacency that is sometimes referred to as the degradation sequence, and, like many roads to disaster, it starts off very well and with the best of intentions.
Are you on the road to complacency?

On 26 April 1986, the world’s worst nuclear disaster happened at Reactor Number Four of the Chernobyl nuclear power plant. Complacency was considered a major factor in the disaster, and our Reflektion this week looks at how complacency fits into the degradation sequence.
The degradation sequence, from reliability through familiarity, dependency, and complacency to vulnerability, describes a gradual erosion of safety margins in complex systems. It is not a sudden failure, but a slow, often unnoticed shift in how people interact with technology and procedures.
Reliability is where the sequence begins. Systems that perform consistently well over time build trust among operators. In high-risk industries like nuclear power, reliability is essential. At Chernobyl, the RBMK reactor design had operated for years, creating confidence in its stability. Operators and engineers came to believe that the system would behave predictably and safely, even under unusual conditions.
This leads to familiarity. As individuals repeatedly interact with a system without incident, they become comfortable with it. Procedures may start to feel routine rather than critical. At Chernobyl, staff were deeply familiar with the reactor’s operation. This familiarity reduced the perceived risk of conducting tests, even those that deviated from standard safety protocols.
Next comes dependency. When a system has proven reliable and familiar, operators begin to depend on it, sometimes excessively. They may trust automated safeguards or assume the system will compensate for human actions. In the Chernobyl case, operators relied heavily on the reactor’s perceived ability to handle instability. They believed that built-in safety mechanisms would prevent catastrophic outcomes, even when they disabled key protections during the test.
Over time, dependency can evolve into complacency. This is a critical turning point. Complacency manifests as reduced vigilance, shortcuts, and a diminished respect for procedures. At Chernobyl, complacency was evident when operators ignored safety protocols, disabled automatic shutdown systems, and continued the experiment despite unstable reactor conditions. The absence of prior accidents reinforced the belief that “nothing will go wrong.”
Finally, the system reaches vulnerability. At this stage, the safeguards, both technical and human, have been weakened. Small errors or unexpected conditions can now trigger major failures. During the Chernobyl test, the combination of design flaws, procedural violations, and operator complacency created a highly unstable situation. When the reactor entered a dangerous state, the response actions triggered a massive power surge, leading to explosions and the release of radioactive material.
This sequence highlights an important point: accidents like Chernobyl are rarely caused by a single mistake. Instead, they result from a chain of gradual changes in behavior and system interaction. Each stage, from reliability and familiarity through dependency and complacency to vulnerability, seems harmless on its own. Together, however, they erode the layers of defense that are meant to prevent disaster.
Understanding this degradation sequence is essential for improving safety in complex systems. It emphasizes the need for constant vigilance, strict adherence to procedures, and a culture that challenges assumptions rather than reinforces them. Even the most reliable systems require respect and critical oversight. Without that, reliability itself can become the first step toward failure.