As a scuba diver who often explores new places, I have found myself in some dangerous situations, yet I always made it back to the surface without facing any negative consequences. Does this mean that I never made any mistakes? Absolutely not: mistakes were made, and lessons were learned.
We can all agree that learning from mistakes is good, but sometimes, when mistakes happen and consequences don’t manifest themselves immediately, we run the risk of not noticing them, not learning from them, repeating them, and over time developing a false sense of confidence, which can drive us to believe that our repeated mistakes are actually good practices.
Why do we ignore mistakes? Because sometimes outcomes are positive even when we make mistakes. “I made it out of the water this time too, so my dive must have been executed perfectly.” This is a common way of reasoning, but in reality, things are much more complex than that. There is a difference between correct execution and successful outcome, and the two should not be confused. In fact, everyone should know from experience that goals can be achieved even if the execution was sloppy and full of mistakes. Failing to see that distinction can have catastrophic consequences.
An example of the consequences of ignoring mistakes is given by the two space shuttle disasters: the Challenger disaster of 1986 and the Columbia disaster of 2003. Both were caused by NASA leadership ignoring concerns raised by the engineering teams. Problems that occurred in previous shuttle launches should have been a wake-up call for NASA leadership. Instead, all the previous successful launches and re-entries, despite the problems, were seen as accomplishments, and fed the leadership’s overconfidence. “We have made it this time too, so all those concerns the engineers raised must have been excessive.”
The tendency to deviate from proper procedures, dismiss valid concerns, and ignore problems has a name: normalization of deviance. The driving force behind normalization of deviance is overconfidence and the false belief that positive outcomes are necessarily the result of correct execution.
Overconfidence and normalization of deviance can spread like a virus in an organization. It is important to be vigilant for signs of overconfidence in individuals, before it infects other people. I once had to deal with a manager who was a self-declared micromanager (and proud of it) but lacked technical foundations and knowledge of the product. He would consistently and quickly dismiss anything he did not understand, and focus on short-term goals of questionable usefulness. Whenever his team accomplished a goal, he would send a pumped-up announcement, often containing inaccuracies, and carefully skipping over the shortcomings of the solutions implemented. Given the apparent success of this management style, other managers started to follow his example. Soon after (in less than a year), the entire organization became a toxic environment where raising even the smallest concern was seen as an attack on the “great new vision”.
I see many parallels between that manager’s story and what is happening with ‘Twitter 2.0’ right now (although, I must say, in my case engineers did not get fired on the spot for speaking the truth). And with that manager, just like with ‘Twitter 2.0’, whenever problems occurred, they would either be ignored or blamed on the preexisting components built before the manager joined, never on the new, careless developments.
The truth, however, was that the problems that occurred had been flagged weeks, or even months, in advance, but the concerns around them had been promptly dismissed as too challenging to address, and because “everything works right now, so that’s not a concern”.
The idea that everything must be correct because everything works, goals are achieved, and outcomes are successful is a dangerous one that can have catastrophic consequences. It’s important to be critical and analytical, regardless of the outcome. This does not mean that success shouldn’t be celebrated, but that mistakes should be captured so that lessons can be learned from them, even if the final outcome was successful. Not learning from mistakes does not allow us to advance; it only leads us to repeat them. And if we keep repeating the same mistakes, sooner or later they will have negative consequences.
A common practice in the aviation industry is to write reports on incidents, close calls, and near misses, whenever they occur, even if the flight was concluded successfully and no injuries or damages occurred. These reports are collected in databases like the Aviation Safety Reporting System (which can be freely consulted online), so that flight safety experts and regulators can identify common failure scenarios and eventually introduce mechanisms to improve safety in the aviation industry. A key element of these reports is that they are not meant to put the blame on certain people, but rather focus on what chain of events led to a certain mistake. “Human mistake” is generally not a valid root cause: if a human was able to make a mistake, it means that a mechanism is missing that can either prevent the mistake or detect it before it causes any negative consequences.
Some companies in other industries have similar processes for writing reports or retrospectives when a mistake happens (regardless of the outcome), with the goal of finding proper root causes and preventing future mistakes. Amazon with its Correction of Error practice is a famous example.
I think introducing these practices in an organization can help to establish a healthy culture where finding mistakes and raising concerns is encouraged, rather than suppressed. However, these practices, by themselves, may not be enough to ensure that such a culture can be maintained over time, because people can always disagree on what counts as a ‘mistake’. Empathy is probably the key to a truly healthy culture that allows people to learn and advance.
There are also cases where we are aware of problems, and we see them as such, but we deliberately choose not to do anything about them. This is where resilience comes into play.
Resilience is generally a good quality to have. Resilience can give us the strength to go through long-term hardships, and can have positive effects on our tenacity and determination. But even resilience, when taken to the extreme, can be dangerous. Resilience can lead us to ignore problems, and not react to them. Resilience can make us tolerate a negative situation without finding a proper strategy to cope with it.
Poor planning forces you to consistently work extra hours? Resist and keep going, until you burn out. The relationship with your partner doesn’t satisfy you? Resist and think that things will get better, while the relationship slowly deteriorates. Feel pain in your knee every time you run for more than 30 minutes? Resist and don’t go to see a doctor, the pain will go away, sooner or later… until you cannot run anymore.
When we let resilience become an excuse to avoid solving problems, we can end up in situations from which it’s difficult to recover.
It’s important to make a distinction between what is under our control and what is not. We can fix problems that are under our control, but in situations where we cannot directly change the course of things, finding an alternative strategy is the only way. Resisting and hoping that things will get better often does not give the expected outcome; on the contrary, it can be detrimental.
In the end, I think that the ‘practices’ of ignoring mistakes (because of the overconfidence built from successful outcomes) and ignoring problems (because of resilience taken to the extreme) are hidden time bombs, silently ticking, waiting for the right conditions before exploding. We need to be aware that just because things seem to work today, it doesn’t mean we’re making the right decisions, and this can have consequences in the future. Being critical, analytical, empathetic, and honest is important to avoid these behaviors and the dangers that come with them.