Case Study: When Does a Mistake Stop Being Honest?*

When does a mistake stop being honest? There is no clear a priori line. It is not as if everybody involved can plainly see a line and agree that, once it has been crossed, the mistake is no longer honest. Rather, it depends very much on how the story of a mistake is told, and who tells it. Here I want to recount one such “mistake,” a mistake that got turned into a crime—first a “crime” of sorts by the organization itself, and then by a prosecutor (who, in this particular country, happens to work for the regulator). Who gets to draw the line? Who gets to tell the story? These questions are critical to understanding how we end up with what might be seen as unjust responses to failure.

The incident happened on the morning of November 21, 1989, when a Boeing 747 on an instrument approach in heavy fog came very near to crashing at London Heathrow Airport. The big airliner had been misaligned with the runway so that, when it started pulling up for a second go, it was actually outside the airport’s perimeter fence, and only about 75 feet off the ground. It narrowly missed a nearby hotel, setting off car alarms all over the parking lot and fire sprinklers in the hotel. The second approach and landing were uneventful. And most passengers had no idea they had been so close to a possibly harmful outcome. For them, the flight was now over. But for the captain, a veteran with 15,000 flight hours, it was only the beginning of a greater drama. Two and a half years later, a divided jury (10 to 2) would find him guilty of negligently endangering his aircraft and passengers—a criminal offense. He would lose his job, and then some.

I reanalyze an account of the incident as masterfully told by Wilkinson15 to illustrate the tension and distance between different interpretations of the same event. Was it a mistake culpable enough to warrant prosecution? Or was it normal, to be expected, all in a day’s work?

We can never achieve “objective” closure on these questions; you can only make up your own mind about them. Yet the case raises fundamental issues about a just culture: How can we produce explanations of failure that both satisfy demands for accountability and provide maximum opportunities for organizational learning?

A BUG, WEATHER, AND A MISSED APPROACH

Wilkinson describes how the captain’s problems began at a Chinese restaurant in Mauritius, an island in the Indian Ocean off Africa. Together with his flight deck crew, a copilot and flight engineer, he dined there during a layover before flying on to Bahrain and then to London. The leg from Bahrain to London would be the last portion of a trip that had begun in Brisbane, Australia.

Several days later, when the flight reached Bahrain, both the copilot and flight engineer were racked with gastroenteritis (stomach flu). The captain, however, was unaffected. A Mauritian doctor had given tranquilizers and painkillers to the flight engineer’s wife, who was also on the trip and had dined with the crew. The doctor had advised the flight engineer to take some of his wife’s pills as well if his symptoms got worse. Flight crews, of course, cannot simply take advice or prescriptions from any doctor, but this one had been recommended by an airline-approved physician who was himself too far away to examine the crew; he would soon be added to the airline’s list anyway. He did not, however, seem concerned that the crew was scheduled to fly again in a few days’ time. A colleague pilot commented afterward:

This was apparently a doctor who didn’t even understand the effects of self-medication in a pressurized aircraft on the performance of a complex task, and right there is a microcosm of everything that pressured the crew to get the job done. That doctor’s vested interest is in sending flight crews out to fly. Certainly if he ever expects to work for the airline again, he isn’t going to ground crews right and left. The company wants you to fly. (As part of the court case years later, however, the captain would be accused of violating the company’s medical procedures.)

The subsequent flight to London was grim. Unexpected headwinds cut into the 747’s fuel reserves, and the copilot, after taking some of the flight engineer’s wife’s medicines to control his symptoms, had to leave the cockpit for several hours. That left the captain to fly a five-hour stretch alone, much of it in the dark.

LONDON FOG

Over Frankfurt, the crew heard that the weather at London Heathrow Airport was bad. Thick fog meant that they probably would have to execute a so-called Category III instrument approach. In Category III conditions, a 747 is literally landing blind. While the wheels may just have poked out of the fog in the flare, the cockpit, considerably higher, is still in the clouds. Category III approaches are flown by the autopilot, with the crew monitoring the instruments and autopilot performance. The autopilot captures two radio beams (a localizer for lateral and a glideslope for vertical guidance). These are transmitted by the instrument landing system on the ground, and the autopilot translates them into control commands to make the aircraft stay on track and on a gradual descent, exactly toward and onto the runway. At least, that is the idea.

The captain, like most airline pilots, had never flown a Category III approach down to minimums, despite his extensive instrument experience. The copilot, new with the airline, hadn’t either. He had not even had the mandatory training for a Category III approach, and was not approved to fly one. But that was not going to stop anything. Still over Germany, the captain had gotten in touch with the airline and requested permission for the copilot to help out on this one approach into London to get them home. Dispensation was granted. It routinely is. It almost always is. The captain, however, never volunteered that his copilot was not in the best of states (in fact, he may not have been in the cockpit at that very moment). Nobody on the ground inquired either.

Later, the copilot testified that nobody had asked him if he wanted a dispensation. But even if he’d been asked, it would have been difficult to refuse. “I accepted, with the airline’s interests at heart, the dispensation to operate to category III auto-land conditions,” he later wrote to the court. “I personally would not mind if we had diverted. But what would the airline have said to the captain if he had diverted without asking for a dispensation? What would they have said to me if I had not accepted it?”

He really had been in a bind. Wanting to help the airline, wanting to get its passengers home, the copilot had agreed to go on with the flight. But he was sick, really. So if the flight had had to divert because he was feeling too poorly to fly a Category III approach, this once, what was he doing on board anyway? And where had those medicines come from?

“This,” Wilkinson observed, “is the heart of the professional pilot’s conflict. Into one ear the airlines lecture, ‘Never break regulations. Never take a chance. Never ignore written procedures. Never compromise safety.’ Yet in the other they whisper, ‘Don’t cost us time. Don’t waste our money. Get your passengers to their destination—don’t find reasons why you can’t.’”

THE APPROACH

Nearing London, the 747 was given a routine holding northeast of the airport. After some time of flying racetracks in the holding pattern, the flight engineer suggested, “Come on, we’ve got two minutes of holding fuel left, let’s buzz off to Manchester.” The crew discussed the options—both Manchester and Gatwick (south of London) were diversion airports, though Manchester had better weather. But the captain “was a very determined man,” as the flight engineer recalled. Just as he was deciding to head off to Manchester, Heathrow called and cleared the 747 for approach.

But a complication had arisen: instead of landing to the east (runway 09), as had been planned, they would now have to turn in shorter and land toward the west (runway 27), because the wind had changed. The approach became a hurried affair. The crew had to reshuffle charts, think and talk through the procedures, revise their mental pictures. A 10-knot tailwind at altitude meant that the 747 was motoring down the approach path toward the runway at an even greater groundspeed. Tightening their slack further still, the approach controller turned the 747 onto the localizer 10 miles from the runway, rather than the normal 12 miles or more. Halfway down, the tower radioed that some approach lights were apparently not working, requiring the flight engineer to take a quick look through his checklist to see how this, if at all, affected their planned procedure. The tower controller also withheld clearance for the 747 to land until the last moment, as a preceding 747 was feeling its way through the fog, trying to find its turnoff from the runway.

But the autopilots really were about to become the final straw: they never seemed to settle onto the localizer, instead trundling back and forth through the beam, left to right. The two autopilots on this old, “classic” 747 may never have been able to capture the localizer: when the aircraft turned in to start its approach, the autopilots disconnected for some time and the airplane was flown manually. The autopilots, built by Sperry, were based on an earlier design. They were never really meant for this aircraft, but sort of “bolted on,” and had to be nursed carefully. On this flight the crew made a later attempt to reengage the autopilots, though radar pictures showed that the 747 never settled on a stable approach path.

The flight engineer was getting worried about the captain, who had basically been flying solo through the night, and still was alone at the controls. The copilot was of little help. “I was not qualified to make this approach and could not make any suggestions as to what was wrong,” he would later tell safety investigators. He stayed out of the way.

The captain was now technically illegal: flying a Category III approach with autopilots that refused to settle down and function perfectly was not allowed. The right decision, as everybody could see in hindsight, would have been to go around, to fly what is called a missed approach, and then to try again or divert to the alternate. “I’d have thrown away the approach, gone to my alternate or tried again. No question about it,” one pilot questioned by Wilkinson said.

But other pilots, some with the same airline, believed the opposite. “Look, he was concerned about fuel. He had a first officer who was no help. He knew a diversion to Manchester would cost the airline a minimum of $30,000. He realized he’d be sitting in the chief pilot’s office trying to explain how he got himself into a position that required a missed approach in the first place. He figured the autopilots would settle down. And I’ll bet he was convinced he’d break out at Category I limits (a higher cloud ceiling and better visibility than Category III) and could take over and hand-fly it the rest of the way. I can understand why he carried on.”

It might have worked, Wilkinson observed. And if it had, nobody would ever have heard of this case.

But it did not work. Ever concerned with passenger comfort, the captain delayed the go-around, and then made a gentle one. The 747 sank another 50 feet. The flight engineer glimpsed approach lights out the left window as they pulled up and away.

As one 747 instructor said, “This is a pilot who was critically low on fuel, which probably was one reason why he waited a second before going around. At decision height on a Category II approach, you look to see the slightest glow of approach lights, you wait ‘one-potato,’ see if anything comes into sight. Perhaps a thousand times before, he’d watched that same autopilot do strange things on the same approach to the same airport, and he’d break out at two hundred or five hundred feet and make a play for the runway. And on the crew bus everybody says, ‘Boy, that autopilot sucked again today.’”

On climb-out after the first try, the copilot noticed how the captain’s hands were shaking. He suggested that he fly the second approach instead, but the captain waved him away. The second approach was uneventful, and was followed by a landing that elicited applause in the passenger cabin.

NO DISCLOSURE, BUT A TRIAL

Back in the crew room after shutting down the airplane, the captain found a note in his company letterbox requesting that the crew see the chief pilot. The captain told the copilot and flight engineer to go home; he would say that they had already left by the time he found the note.

But he did not go to the chief pilot either. Nor did he talk to an airline safety investigator about what had happened. Instead, he drove straight home and went to bed. That evening, a call came from the airline. The crew had been suspended.

The airline launched an internal investigation and later issued a report chiding the copilot and flight engineer. It also demoted the captain to first officer. The aviation authority downgraded his license accordingly, and he was relegated to riding out the rest of his career in the right seat, no longer in command.

This was too much. Half a year after the incident, the captain resigned from the airline and began to appeal the authority’s reduction of his license. Some saw no problem with the demotion: the pilot had recently been receiving grades of “average” on his half-yearly proficiency checks in the simulator, and instructors had noted his inability to perform well under pressure.

But why did the regulator take him to court? This “remains the subject of speculation,” Wilkinson writes. “There is considerable feeling that the airline was not sorry to see it happen, that the captain was a loose cannon who could have made things awkward for an airline that places great value on its public image. Some feel that the captain could have revealed some controversial company procedures. If the captain were branded a criminal, it would effectively negate whatever damage he might do… Others suspected empire building within the regulator’s legal branch: this looked like a juicy case for an aspiring prosecutor to take public and demonstrate that even the flag carrier’s jumbo jet captains dare not take on the aviation authority casually.”

Six weeks after the incident, the airline had announced that it was no longer granting bad-weather dispensations. But the fleet manager who had authorized the approach with the copilot’s dispensation was not in the dock. Nor was the controller who turned the big 747 onto a tight approach, separated by what seemed like only five miles, rather than the legal minimum of six, from the preceding 747. With traffic from all over the world converging on London at eight in the morning, those rules were obviously allowed to be flexible.

It was the pilot who was in the dock. Seated next to a policeman. Why had he not filed a Mandatory Occurrence Report right after the flight? Because it did not constitute an occurrence, the pilot argued. After all, he had gone around, or at least initiated a go-around, and landed uneventfully the second time. Why had he gone around so slowly? Because the supposedly canonical technique was not described anywhere, he argued. At some point in the trial, the pilot produced a transcript of every oral call-out, checklist response, and radio transmission that company and government regulations required the crew to accomplish during the approach. It showed that the entire routine took seven minutes. The approach had lasted only four, making it technically impossible to make an approach and follow all applicable rules at the same time.

Few cared. Jurors sometimes even napped. If the trial did not revolve around arcane legal points, it did so around finely grained technical ones. The pilot was never called to testify on his own behalf.

The defense dwelled on the fact that the old 747 was dispatched on its next leg out of London without any check of the autopilot to see whether it was faulty. To this day, four crucial pages of the maintenance log, which might have revealed something about the autopilot, are missing (in a parallel to the prescription missing from the medication log in Mara’s case; see Chapter 3).

“The regulator itself was at fault,” a legal expert and airline pilot commented, “for permitting a situation to exist in which the airline’s flight operations manual contained a provision that the captain would be expected to use, by which it could authorize him to make the approach without a qualified copilot. The approach was actually illegal at the fault of the airline, yet they were not charged. Had that provision not existed, the captain would have diverted to Frankfurt with cozy fuel reserves, to await better weather at London.”

A split jury found the pilot guilty. The judge fined him only £1,500 and rejected the regulator’s demand that he pay a further £45,000 to cover court costs. The pilot appealed the verdict, but the appeal was summarily rejected.

When he was young, the pilot lived near an air force base where he would watch airplanes take off and land at the end of the war. That inspired him to become a pilot. “On December 1, 1992, three years and nine days after the incident, the pilot left home without a word to his wife. He drove some nine hours to a beach near the air force base. There he ran a hose from his car’s exhaust pipe through a nearly closed window. In a matter of minutes he was dead. He left no letter or any explanation.”

It would be too easy to ask whether the prosecution and conviction of the captain was right. Or just. Because it is too difficult to answer. Was this a crime?

Multiple descriptions of the events are plausible. The disappearance of documents without a trace in these cases can always give people the chills. Was it a conspiracy after all, a “cover-up,” as some of Wilkinson’s interviewees suggested? It could have been: turning one pilot into a highly visible scapegoat to silence him and others. This would save the reputation of both the airline and the regulator, who also happens to employ the aviation prosecutor in this country. But conspiracies take tight coordination and demand iron discipline from those involved in them.

Also, as a captain, this pilot had lately been “average,” not stellar. He was stubborn and determined. He was ultimately responsible for getting himself and his crew into this jam. And then he apparently refused to cooperate and did not want to disclose or discuss the incident (it wasn’t an occurrence to him, after all) until forced to do so in the adversarial setting of a trial.

Who is right? Whose version of events is true? The tension between multiple possible interpretations remains until the end of Wilkinson’s story. But important points about building a just culture do stand out.

•  A single account cannot do justice to the complexity of events. As the physicist Niels Bohr argued to his contemporaries Einstein and Heisenberg, we need multiple layers of description, partially overlapping and always somehow contradictory, to have any hope of approximating a rendition of reality.

•  A just culture accepts nobody’s account as “true” or “right” while dismissing others as wrong. That only leads to moral grandstanding, to imperialism, and to losing situations like this pilot’s. Instead, it accepts the value of multiple perspectives, and uses them to encourage both accountability and learning.

•  A just culture is not about absolutes, but about compromise. Achieving justice is not about black and white. Instead, it presumes compromise. Justice in a just culture cannot be enforced; it must be bargained. Such bargaining for justice is a process of discovery, a discovery that the best bargain may be an outcome in which every party benefits, for example, an explanation of events that satisfies calls for accountability and helps an organization learn and improve.

•  A just culture pays attention to the “view from below” among these multiple accounts, as that view (in this case from the person in the dock) may have little or no power to assert itself and is the easiest to quash. Silencing it can be organizationally or politically convenient. You may even see it as imperative. You may see putting others in an inferior position as a necessary, if sometimes annoying step in achieving other goals. But this makes it even more morally essential to give the view from below a voice.

•  A just culture is not about achieving power goals, by using other people to deflect attention away from one’s own flaws. This denies such people their personhood; it makes them a mere instrument in the pursuit of protection of power, of existing structures or arrangements. Most people will see this as unethical, and it violates the basic principles of Aristotelian justice that many of our societies still live by.16

•  Disclosure matters. Not wanting to disclose can make a normal mistake look dishonest, with the result that it will be treated as such. Multiple examples in this book illustrate this. Disclosing is the practitioner’s responsibility, or even duty.

•  Protecting those who disclose matters just as much. The demand to disclose in the pilot’s case above (a note in the letterbox) may not have given him confidence that honest disclosure would be treated fairly. Conditions at his airline may have been unfavorable for honest disclosure. Creating a climate in which disclosure is possible and acceptable is the organization’s responsibility. And more protections are often necessary.

•  Proportionality and decency are crucial to a just culture. People will see responses to a mistake as unfair and indecent when they are clearly disproportionate. “What was the guy found guilty of?” a pilot friend had asked Wilkinson in amazement. “Endangering his passengers,” Wilkinson replied. “I do that every day I fly,” the friend said with a laugh. “That’s aviation.”15 The eventual punishment given to this pilot (a symbolic fine) may have indicated that the trial was seen as a disproportionate response to an event that perhaps should not have ended up in court. Proportionality means heeding Martin Buber’s dictum: “What is necessary is allowed, but what is not necessary is forbidden.”

By the time a case reaches trial, much of the preceding has either been wasted or rendered impossible. A trial cannot do justice to the complexity of events, as it necessarily has to pick one account as the truest or most trustworthy one.

A seeming lack of honest disclosure is often a trigger for a trial. This could have been the case here. You can also see it in the literature on medical lawsuits. Patients or their families do not typically take a doctor to court until they feel that there is no longer any other way to get an account of what went wrong.17 Stonewalling often leads to a trial. But a climate that engenders anxiety and uncertainty about how disclosure will be treated often leads to stonewalling. The more we take cases to trial, the more we could be creating a climate in which freely telling each other accounts is becoming more and more difficult.

*  This case study is taken from Ref. 15.
