There were a lot of comments on the original article about accountability sinks. People seem to feel strongly about human judgement being replaced by formal processes. So let’s dive into the comments!

Ben Pace writes:

I liked reading these examples; I wanted to say, it initially seemed to me a mistake not to punish Wascher, whose mistake led to the death of 35 people [during the plane crash at LAX airport].

I have a weak heuristic that, when you want to enforce rules, costs and benefits aren’t fungible. You do want to reward Wascher’s honesty, but I still think that if you accidentally cause 35 people to die this is evidence that you are bad at your job, and separately it is very important to disincentivize that behavior for others who might be more likely to make that mistake recklessly. There must be a reliable punishment for that kind of terrible mistake.

That’s a completely understandable position, but when you read about what actually went wrong at LAX, the argument about disincentivization loses some weight (from the original Asterisk article):

1. LAX was equipped with ground radar that helped identify the locations of airplanes on the airport surface. However, it was custom built and finding spare parts was hard, so it was frequently out of service. The ground radar display at Wascher’s station was not working on the day of the accident.

2. It was difficult for Wascher to see Intersection 45, where the SkyWest plane was located, because lights on a newly constructed terminal blocked her view.

3. After clearing the USAir plane to land, Wascher failed to recognize her mistake because she became distracted searching for information about another plane. This information was supposed to have been passed to her by another controller but was not. The information transmission hierarchy at the facility was such that the task of resolving missing data fell to Wascher rather than intermediate controllers whose areas of responsibility were less safety-critical.

4. Although it’s inherently risky to instruct a plane to hold on the runway at night or in low visibility, it was legal to do so, and this was done all the time.

5. Although there was an alarm system to warn of impending midair collisions, it could not warn controllers about traffic conflicts on the ground.

6. Pilot procedure at SkyWest was to turn on most of the airplane’s lights only after receiving takeoff clearance. Since SkyWest flight 5569 was never cleared for takeoff, most of its lights were off, rendering it almost impossible for the USAir pilots to see.

Firing Wascher would have done little to solve these issues. Fixing the underlying problems, on the other hand, would. After all, if your approach to safety is to hire only supermen who make no errors, you have already failed. Meanwhile, threatening Wascher and, in effect, all air traffic controllers would lead to cover-ups and make identifying those underlying issues much harder.

All that being said, it’s obvious that blameless culture does not work in some contexts. Specifically, it does not work when there is a huge incentive to cheat. If blameless culture were used, say, in a governmental procurement agency, pretty soon all the contracts would be awarded to friends and relatives.

However, I used to feel that we could get away with blameless culture in cases where the incentives of the organization and the employee are aligned. For example, neither the doctor nor the hospital wants patients to die. The hospital can therefore assume no ill intent on the part of the doctor. Or consider pilots: neither they nor the airlines want planes to crash…

But other comments made me question my assumptions. Here’s one from philh:

Some people really are incompetent or malicious. […] A blameless postmortem of Royal Air Maroc Express flight 439 would presumably have left that captain still flying? At any rate, as long as "the captain gets fired" is a possible outcome of the postmortem, the captain has incentive to obfuscate. (Which he tried unsuccessfully. Apparently we don't know if he was actually fired though.)

That was intriguing enough to make me read the linked article, and oh boy:

The weather conditions were now even worse than on their previous approach, with a cloud ceiling at 600 feet and fog off the coast stretching all the way down to sea level.

The descent was fast and steep from the beginning. As they descended toward 6,000 feet, the Captain instructed the First Officer to use an excessive descent rate of -1,800 feet per minute, and their airspeed was 230 knots, far above normal. There was no obvious reason for this; apparently, it was just how the Captain liked to fly. “You fly, I will watch the speed and the water,” he said to the First Officer, betraying his intention to keep descending until he caught sight of the Mediterranean Sea. He then switched off the EGPWS [ground proximity warning system], just as they had briefed on the ground in Tangier.

[…]

The plane reached a height of 445 feet. Still no sign of the runway. “We keep going now, keep going,” the Captain said. “Go, go, we keep going.” He reached over to the autopilot panel and changed the descent rate back to -1,800 feet per minute, which was beyond reckless, given that they were now at a height of 310 feet. At this point the ground proximity warning should have sounded, but it didn’t because the pilots had turned it off.

The First Officer immediately reached over and changed the descent rate back to -1,400 feet per minute. “This is not normal!” he exclaimed, becoming increasingly terrified as the plane hurtled toward the sea below. “Now take it manual,” he muttered to himself in Arabic. At a height of 80 feet above the water, he disengaged the autopilot and began to pull up.

But the Captain had other ideas. “Oh, yeah this is fine,” he said, grabbing the control column to pitch the nose down! Even at 80 feet above the ground, he’d be damned if he let his First Officer level off without seeing the water. They were going until it was visible, no matter what!

Now the First Officer was pulling up while the Captain pushed down, creating a massive force differential on the linked control columns as the pilots fought against each other for control of the airplane. But the Captain was stronger, and he held the plane in a descent for several more seconds as the First Officer tried desperately to overpower him. At 35 feet the First Officer advanced the throttles to high power, but it was already too late. At 8:03 and 53 seconds, the ATR-72 slammed into the water with a force of 3.2 G’s, causing it to bounce back into the air. Incredibly, the Captain kept pitching down. Two seconds later they hit the water again, this time pulling nearly 4 G’s, far beyond the structural limits of the landing gear.

“Oh, merde,” the Captain exclaimed, pulling back sharply on the controls. “Merde!”

You are left wondering whether reckless human behaviour can get any more terrifying. Any such doubts are promptly resolved when you move on to the comment by Brinedew:

January 22nd: Dr. Chiba's heart sinks when he learns that Matsui has pressured yet another patient into surgery. The patient is 74-year-old Mrs. Fukunaga, and the procedure is a laminoplasty—the same one that left Mrs. Saito paralyzed from the neck down 3 months ago. 'Please let Matsui learn from his mistakes,' Chiba pleads. Knowing that Matsui's grasp of anatomy is tenuous at best, Chiba tries to tell Matsui exactly what he needs to do. 'Drill here,' Chiba says, pointing at a vertebra. Matsui drills, but the patient starts bleeding, constantly bleeding. He calls for more suction, but it's no use; blood is now seeping from everywhere. Matsui is confronted by his greatest weakness: the inability to staunch bleeding, the one skill that every surgeon needs. The operating field is a sea of red. As sweat rolls down his face, Matsui is in complete despair. He knows he has to continue the surgery, so the only thing he can do is pick a spot and drill.

A sickening silence. Even Matsui can feel that something is wrong because his drill hits something that is definitely not bone. Dr. Chiba looks over and lets out a little whimper. Matsui has made the exact same mistake as last time: he's drilled into the spinal cord, and this time the damage is so bad that the patient's nerves look like a ball of yarn. There's actually video footage of this surgery. Yes, it really looks like a ball of yarn, and no, you really don't want to watch it, trust me. The footage ends with Matsui literally just stuffing the nerves back into the hole he drilled and hoping for the best. This was Matsui's most serious surgical error yet, and it would later come back to haunt him. But for now, all he got was a slap on the wrist. A month later, he was back at it. He was going to perform another brain tumor removal—the very first procedure he failed at Ako.

Brinedew dryly comments:

One aspect I found interesting: Japan's defamation laws are so severe that the hospital staff whistleblowers had to resort to drawing a serialized manga about a "fictional" incompetent neurosurgeon to signal the alarm.

Clearly, in the real world, there’s a need to balance accountability with the ability to cope with unintentional errors. Multiple commenters have pointed to the concept of “just culture”. Some of them have also mentioned Sidney Dekker’s book of the same name.

From the blurb at Amazon:

You will learn about safety reporting and honest disclosure, about retributive just culture and about the criminalization of human error. Some suspect a just culture means letting people off the hook. Yet they believe they need to remain able to hold people accountable for undesirable performance. In this new edition, Dekker asks you to look at 'accountability' in different ways. One is by asking which rule was broken, who did it, whether that behavior crossed some line, and what the appropriate consequences should be. In this retributive sense, an 'account' is something you get people to pay, or settle. But who will draw that line? And is the process fair? Another way to approach accountability after an incident is to ask who was hurt. To ask what their needs are. And to explore whose obligation it is to meet those needs. People involved in causing the incident may well want to participate in meeting those needs.

Feels relevant, but I haven’t read the book, so no guarantees.

The prize for the most insightful comment goes to hliyan:

1) About 8 years ago I was gifted a copy of Ray Dalio's Principles. Being a process aficionado who thought the way to prevent bureaucracy was to ground process in principles, I was very excited. But halfway through I gave up. All the experience, the observations, the case studies that had led Dalio to each insight, had been lost in the distillation process. The reader was only getting a Plato's Cave version. I used to love writing spec-like process docs with lots of "shoulds" and "mays" for my teams, but now I largely write examples.

2) I live in a Commonwealth country, and as I understand (IANAL), common law, or judge-made law, plays a larger role in the justice system here than in the US, where the letter of the law seems to matter more. I used to think the US system superior (less arbitrary), but now I'm not sure. Case law seems to provide a great deal of context that no statute could ever hope to codify in writing. It also carries the weight of history, and therefore is harder to abruptly change (for better or for worse).

I was aware of the value of practical examples in education — a tradition going back to Comenius and his School by Play — and in conveying knowledge more broadly. After all, my original article was essentially a collection of practical examples. But I had never considered the idea of using examples to set rules, whether at the workplace or in legal contexts, instead of relying on unambiguous, cut-and-dried laws and processes.

And it’s not only that examples are easier to relate to. They have the nice property of not spelling out the rule directly, only showing one instance of it. It is left to the practitioner to infer the underlying rule. Instead of saying “In situation X, do Y,” they say “Here are some examples of how a decent human being would behave in situations similar to X.” Heck, there can even be multiple mutually contradictory examples, making it clear that in the end, it’s up to each individual to interpret the spirit of the law rather than blindly follow its letter.
