I discussed in a previous post the law of least effort and its detrimental effect on our decision-making abilities. To recap very briefly: our brains tend to take the most undemanding course of action if presented with numerous ways of achieving much the same objective, as a result of which we are wont to believe that choices come ready-made and that nearly every problem, however serious or complicated it might be, can be solved at speed.
This innate unwillingness to consider a comprehensive array of would-be solutions almost inevitably leads to a rush to judgment. We act in haste and repent at leisure, and all of the “what ifs” that we might usefully have contemplated before surrendering to our knee-jerk inclinations duly return to haunt us when our imprudence eventually becomes clear.
One reason why such a bent for quick-fix answers so often proves damaging is that initial shortsightedness is frequently accompanied by a subsequent reluctance to admit a change of course is urgently required. Some of the most celebrated leaders in history have succumbed to this double whammy: witness, for instance, Napoleon’s disastrous determination to persist with his invasion of Russia in the face of repeated and ultimately decisive setbacks.
While Napoleon very probably regarded himself as answerable to no-one, most of us accept we are in some way accountable for our less successful choices. It is true, too, that accountability is extremely important. Yet there is a marked difference between accountability and blame culture, and it is the latter that defines many modern-day organisations and which routinely plays a pivotal role in ensuring that mistakes, rather than being learned from, are ignored or even perpetuated.
By way of illustration, let us briefly examine two institutional responses to error. Both are from the same sphere, which makes for a direct comparison. Each revolves around a tragedy in which more than a hundred people died. Together they underscore the vast chasm between an open-minded philosophy that addresses poor decisions and a corrosive ethos that in many ways merely encourages even more of them.
A culture of fear
On April 25, 2005, a commuter train left the track and ploughed into a block of flats in the industrial city of Amagasaki, Japan. An official inquiry concluded that the driver had been trying to make up time when he lost control.
“Lost time” is a painfully relative term here. The reality is that the train was just 80 seconds behind schedule. What manner of prospect was sufficiently daunting for the driver to infer that endangering his own life and the lives of his passengers would be preferable to arriving little more than a minute late?
The answer surely lies in the severity of the penalties he faced for failing to keep to the timetable. He would have been not just heavily fined but sent on an “education” course that reportedly prized humiliation over training. Small wonder that even in the seconds immediately before impact he applied only the service brake, knowing that use of its emergency counterpart would invite additional punishment.
A culture of improvement
On December 29, 1972, Eastern Air Lines Flight 401 plunged into the Florida Everglades en route from New York to Miami. The crash happened after every member of the flight crew became preoccupied with an unlit landing gear indicator.
Their unswerving focus on this particular issue blinded them to another that merited much greater attention: the autopilot facility had been inadvertently turned off, as a consequence of which the plane was gradually descending to its doom. Only afterwards did crash investigators discover there had been nothing wrong with the landing gear and that the indicator’s light had simply burnt out.
Motivated by this incident and others caused principally by human flaws rather than technical faults, the aviation industry introduced crew resource management. An all-encompassing set of new principles and procedures, CRM quickly helped cement air travel’s standing as the safest form of mass transportation.
Accountability as a force for good
The first of these examples highlights the perils of an organisational mindset that refuses to acknowledge that to err is human. When the prevailing attitude is that someone must carry the can, that the buck must stop somewhere, there almost invariably exists an unnecessary pressure that in the end serves only to contribute to the very bad decisions for which individuals are later condemned.
By contrast, the second example emphasises the value of an organisational mindset that is prepared to accept not only that most of us are wrong more often than we are right but that errors, whether minor or catastrophic, are essential to progress. It is these organisations that benefit staff and stakeholders alike by avoiding a debilitating cycle of unfortunate choices, unhappy outcomes, and unthinking censure.
Yes, accountability has its place. There is no doubt about that. But we would do well to recognise that it means precious little if we decline to use it as a tool for improvement and instead employ it only as a weapon for heaping fresh misery on those who “called it wrong”.
Martin Binks is the former dean of Nottingham University Business School and a Professor of Entrepreneurial Development at its Haydn Green Institute for Innovation and Entrepreneurship. firstname.lastname@example.org