Perrow coined the term "system accident" when this book was released in 1984. Since then, many of the examples and much of the analysis in this book have grown into a field of study known as Normal Accident Theory.
I'm going to suggest that, even though the public has come to realize that there is rarely a single cause to a catastrophe, the tendency is still there for the blame to descend to the operator's level, while the architects and designers get off scot-free. Witness the financial collapse of 2008 as a prime example, and if that isn't a good argument for mandatory capital punishment for white-collar crime, I don't know what is.
As such, I realize that there are multiple sources for disasters, but still, when it comes down to it, I can't shake the feeling that most of them boil down to incompetent management. Incompetent fuckups who think they know what they are doing. When it comes to maritime accidents, one could blame the captain at the helm, but the rot usually starts further up the chain of command.
Couple this incompetence with a willful ignorance of inconvenient evidence, and you get a modeling system inside that bony carapace of a skull that almost invariably ignores dangers in favor of a rosier, rainbow-unicorn assessment of the situation (and who does a better job of that than clueless executives?):
"Why would a ship in a safe passing situation suddenly turn and be impaled by a cargo ship four times its length? For the same reasons the operators of the TMI plant cut back on high pressure injection and uncovered the core. Confronted with ambiguous signals, the safest reality was constructed."

So, this has been addressed in one form or another by increasingly sophisticated risk assessment methods, but Perrow has suggested that some areas (especially, I would note, those that can be automated) might best be abandoned as human pursuits. Nuclear power and weapons being two of them.
I disagree. One thing I note is that if any area has had more incompetent monkey-clusterfuck accidents with more devastating potential, it is the area of nuclear weapons. And yet, if the systems for deploying and maintaining nuclear weapons are so highly complex and tightly coupled that they are accidents waiting to happen, then wouldn't the past 70 years suggest they are not a Great Filter?
So, I would quintuple the quality control on bonehead executives, or at least, given that their pay has increased some 350% over the past 30 years, subject them to 350% more scrutiny than they suffered 30 years ago. Public executives are not so much the problem, as they are already under public scrutiny. It's the private bastards who are going to wreck things for everyone. And so long as they insist on socializing the losses and privatizing the gains, they need to be kept firmly under thumb, just to keep the hanky-panky to a minimum.
The other obvious choice is to loosen the coupling of the internal components of so many of these systems. Tight coupling makes them more efficient, but it also makes them more brittle, and having a little inefficiency is not necessarily a bad thing. As such, the term "slack" comes to mind. Though this is considered an ineffable term within the Church of the SubGenius, I now suggest it be incorporated into every single human design element to the end of time.
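Software people will recognize the pattern immediately. Here's a toy sketch of the idea (mine, not Perrow's, and the flaky consumer and queue are invented for illustration): in the tightly coupled version the producer hands work straight to the consumer, so one transient hiccup kills the whole run; in the loosely coupled version a queue of pending work provides the slack to absorb the hiccup and finish anyway.

```python
from collections import deque

def flaky_consume(item, failures):
    """A consumer that transiently fails, once each, on items in `failures`."""
    if item in failures:
        failures.remove(item)  # the failure is transient: a retry will succeed
        raise RuntimeError(f"transient failure on {item}")
    return item * 2

def run_tight(items, failures):
    # Tight coupling: each item goes straight from producer to consumer,
    # so the first transient failure propagates and takes down the run.
    return [flaky_consume(i, failures) for i in items]

def run_loose(items, failures):
    # Loose coupling: a queue sits between the stages. A failed item goes
    # back on the queue and is retried later; the slack absorbs the hiccup.
    queue = deque(items)
    results = []
    while queue:
        item = queue.popleft()
        try:
            results.append(flaky_consume(item, failures))
        except RuntimeError:
            queue.append(item)  # absorb the failure instead of crashing
    return results
```

The loose version is less efficient (the failed item is handled twice, out of order), which is exactly the trade Perrow is pointing at: you buy resilience with a little waste.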
Oh, right, and the accidents detailed in the book are highly amusing, and worth the read.