The Buzzkill Argument

Getting risk/reward judgments right is a task of the ages. We only seem to learn from them when things go wrong.

I don’t know about you, but I sometimes grow weary of the self-imposed burden of being an Ambassador for General Aviation. I mean, I’m not gonna bring a crack pipe to the Young Eagles picnic, but I’m sure as hell not going to be the eternal gladhander, either. Other people are better at that sort of thing than I’ll ever be.

On the other hand, I try to conduct myself so as not to do something really stupid, or at least not get caught at it. So there’s a certain inner tension between my role as a writer and analyzer and my innate comfort level with risky things. I once heard the couch rats at an airport I used to fly out of say, “That dude’s kinda crazy.” They were referring to my enthusiasm for taking instrument students out into northeastern winter weather. An Aunt Jane I ain’t. Not yet, anyway.

Still, I notice something in our comments from time to time that I call the Buzzkill Argument. It’s usually in reaction to some cautionary essay about risk judgment or decision making in accident scenarios. The gist of it is the classic edge-case argument: “Why don’t we just outlaw everything risky?” In other words, all this confronting of risk is killing the fun and wrecking general aviation in the process.

I’ve heard the argument before and I’ve made the argument before because I am, after all, a guy who jumps out of airplanes and rides motorcycles, sometimes on racetracks. The buzzkill argument surfaced most recently in Roy Evans’ blog about accidents in STOL competitions and in my essay about a young woman’s project to fly around the world in an LSA sans instrument equipment or a rating. I could find other examples.

I’ve always thought there is a molecule-thin membrane between understandable, acceptable risk and wild-eyed, mouth-foaming stupidity. And it’s movable, varying with the person making the judgment, with the day and with the angle of sunlight hitting whatever minimal facts are available to inform the judgment.

When the decision is murky, which it frequently is, the Buzzkill Argument can turn red to green in the name of fun because, after all, we’re all reaching for the gusto, right? We blather on about the phrase “safety culture” but most of us, including me, can’t always accurately define it even if we’re pretty sure we know what it means. Buzzkill is antagonistic to safety culture because in marginal circumstances, I think it erodes the discipline to say, “I probably could do that, but I’m not going to.”

It can also obscure the wisdom often buried in the routine, traditional and accepted way of doing things. Like safety wiring fasteners, using fire sleeves where you’re supposed to, doing mag inspections, sumping fuel and a long list of mundane tasks that, taken together, form the foundation of survival. When this stuff is ignored or compromised, so goes safety culture, if it was ever there at all. Psychologists even have a phrase for it: normalization of deviance. That’s just fancy talk for the wink and nod we’ve all seen—or done—in our flying careers that says, “yeah, I know this isn’t right, but I think I can get away with it.”

If it goes far enough, you get a crash like the Collings Foundation B-17 fatal at Windsor Locks, Connecticut, in 2019. Ahead of that crash, the Buzzkill Argument was in full flower: Let people fly on 75-year-old warbirds and have their fun, accepting whatever risk they want. So we did, and five unsuspecting passengers who surely had no inkling of the risk they were assuming died as a result. Collings had a reasonable safety and oversight program in place—the nuts and bolts of an FAA-approved safety culture—which it failed to follow, breaking faith with passengers and the public and defining normalization of deviance.

In the context of STOL events, the stakes aren’t nearly so high, since passengers aren’t involved and in most cases we’re talking about bent metal, not broken bodies. Nonetheless, in my view, it’s worth stepping back and asking whether we’re normalizing accidents and incidents that could be prevented and whether there are things we could do to avoid them. So after the metal is bent, what needs to be done to prevent it happening again? One step is to ask this: If you normalize a risk that chronically results in incidents or accidents, have you lost the ability to draw the line anywhere?

This happens occasionally in skydiving and it happened to me yesterday. We were doing an ash scattering dive for a pilot and skydiving friend who many of us had known for years. He died earlier this month of natural causes. Ash dives are challenging because so much can go wrong, but getting it right is a sweet tribute to the departed. We got it somewhere in between.

The moment I stepped out of the airplane, I knew we were headed straight for a juicy cumulus cloud, one it might have looked like we were going to miss. We didn’t. On the ground later, everyone had a good laugh about it, but my resistance to normalizing it was to suggest that the next time we do this—and there will be a next time—we put our A-game spotter in the door. On this jump, she was seated farther forward. Spotting a load isn’t difficult, but at the narrow end of the judgment envelope where the clouds sometimes are, it takes confidence and discipline to put your foot down and take the airplane around for another pass if the situation is even remotely doubtful. Skydivers are the very definition of go fever, but sometimes they have to be throttled back. (We’re subject to the same cloud clearance requirements of 91.155 as aircraft are. I don’t like to wink and ignore them.)

Skydivers like to wish the departed a fond “blue skies,” and our friend Steve would have savored the delicious irony of our having deposited his remains inside a cloud. I might have, too, if we hadn’t done it by overlooking something we shouldn’t have.