Embracing Risk
If we want to become old pilots, we sometimes have to be bold ones, too.
It's often difficult to compare the risks imposed by different activities, but it's reasonable to state that flying a certified single-engine airplane for an hour on a severe-clear day isn't as risky as spending that same time performing low-level aerobatics in an Experimental airplane. At the same time, according to John King of King Schools, you're more likely to have a fatality in a GA airplane than in a car when traveling the same distance. If the added risk exposure we get from flying didn't provide some benefit, such as more efficient transportation or pure enjoyment, we might not do it at all. But for many of us, the simple enjoyment of boring holes in the sky, along with other benefits, outweighs that risk.
The risk-versus-reward calculation is a major factor affecting our risk tolerance: The greater the reward, the more risk we are willing to tolerate. Admitting that aviating adds risks is the first step toward better management of them, and managing your own set should start with a planning process informed by this doom-and-gloom perspective. I know it's a buzz-kill, but each flight has the potential for something to go wrong, including a unique profile of possible contributing factors. One question to ask yourself is what the NTSB report might contain. Such a perspective can be a good way to hold in check the runaway set of events that might lead to a bad outcome.
Risk Tolerance
Risk tolerance can be thought of as our personal threshold of what we're willing to subject ourselves to, usually in exchange for some benefit. Humans all have differing degrees of risk tolerance: Skydivers, mountaineers, hang glider pilots and people who pursue adventure sports skew to the more risk-tolerant side of the curve. Agoraphobics and people who absolutely will not fly reside on the other side.
As aviators, we typically have demonstrated greater risk tolerance than our non-aviating friends, particularly those who are less than enthusiastic about getting in a plane with us. Flying, however, can trigger risk aversion in people who are otherwise risk-tolerant. My old climbing partner, Tim, was willing to bivouac with me in the mountains at 14,000 feet in the winter and canoe the Grand Canyon in a boat we salvaged off a bridge. Yet he was hesitant to do one lap in the pattern with me. Perhaps he knows me too well, but his perspective on the risk of flying was different from his assessment of other risks he routinely accepted.
Every aviator has a different threshold of comfort driven by experience, training and our own individual aviation-risk sensibilities. An example is a YouTube video, at http://bit.ly/Pfww, of a pilot landing a Piper Super Cub on a rocky outcrop on the shoulder of a mountain in the middle of nowhere. He skillfully lands in a few hundred feet or less and, after a pause in the video (where I suspect he recycled some coffee), he fires up and departs in even less distance.
Some would say it was an example of a well-calculated risk; others would say it was pushing the envelope too far. Predictably, comments on the video fell into three basic camps: attaboys, "great for him, but not for me" and "there are bold pilots and there are old pilots."
The point is we each have our own views of risk and willingness to accept risk for ourselves. Having others on board, however, changes our risk tolerance, or at least it should.
Expanding the Envelope
I take exception to the oft-quoted bromide that there are no old, bold pilots. If that were entirely true, there would be no pilots at all. Sure, it applies to the hold-my-beer-and-watch-this knuckleheads, but in my experience those outliers don't dominate the aviation community. They exist, but I don't think the NTSB reports reflect an outbreak of overly bold pilots becoming natural-selection events.
I would argue the opposite: Each of us individually made a series of bold moves to advance our skills. When you first soloed an aircraft, can you honestly say you didn't have an elevated heart rate when you turned final? What about your first night landing? Your first excursion to a new airport? Becoming a pilot requires bold steps that advance you along your journey.
To continue advancing your skills, you must continue to be bold. While we all are encouraged to have personal minimums, I also think it is important to have a plan for methodically expanding them. We can use our own experience and training to expand the envelope, we can add ratings and we can get recurrent training. By thoughtfully pushing our own limits, we become better pilots.
One of my favorite Will Rogers quotes is, "Good judgment comes from experience, and a lot of that comes from bad judgment." Exploring the edges of our own risk envelope requires good judgment and, perhaps more important, good luck. Luck is a factor because sometimes we learn the most when things go wrong. You can practice emergency maneuvers and emergency checklists all you want, but nothing clarifies that training like the real thing. Real-life experience has been a much better teacher than most of my CFIs.
Some of my best training came from reflecting on my own experiences, including my own foolish mistakes. It also comes from reading about the mistakes of others in publications like Aviation Safety. In researching my recent article on whether more experienced pilots are safer (see "Safer By The Hour?" August 2013), one significant factor that improved pilots' risk-management performance was a commitment to ongoing training and continuous learning. Pilots who read safety articles, take courses and actively think about managing their risk tend to be safer than those who are disengaged.
While there is learning potential from surviving major screw-ups, you also have to be lucky. The Will Rogers quote is excellent, but he died in a plane crash with Wiley Post in 1935. You can only learn from mistakes you survive.
Owning Our Irrationality
While we usually think of ourselves as rational animals, we actually operate by intuition and emotion more than by facts and analysis, particularly when it comes to assessing and managing risk. For example, I don't ride motorcycles as a hobby because I think they are too dangerous, yet I consider the risk comparable to flying a single-engine airplane. That is irrational. Risk perception is irrational. Understanding that the way we view some risks is irrational may make us safer.
The reason I prefer the risk of flying to the risk of riding a motorcycle has to do with my perception of voluntary versus involuntary risk. In my view, the greatest risk of riding a motorcycle is from others running into me; I feel I have greater control over the dangers of flying. But if the risks of two activities are the same, I shouldn't view one as more dangerous. I do, however, because that's how it feels to me. We evaluate risk by feel more than by our command of actuarial data and cold calculation of probabilities.
There also is the issue of perceived benefit. I enjoy the experience of flying more than the experience of riding a motorcycle, so I get more pleasure for the same amount of perceived risk exposure. The greater the benefit, the more willing we are to accept greater risks. That is why, to me, flying seems less risky than riding a motorcycle.
The way we humans assess and manage risk is the subject of thousands of studies in fields ranging from psychology to economics. There are many reasons we diverge from logical thinking. One reason we make stupid risk-based decisions is a wide variety of cognitive biases, which tend to lead us to flawed heuristics. Heuristics are the experience-based, knee-jerk mental shortcuts we use instead of actual rational thinking. One of the best ways to avoid these cognitive pitfalls is to know they exist, understand how they work and view them as the song of the sirens luring us onto the rocks.
It is not entirely fair to simply say humans are totally irrational; the psychological theory of bounded rationality cuts us some slack. The bounded rationality concept recognizes that in decision making, our rationality is limited by the information we have, our own cognitive biases, our training and experience, and the finite amount of time we have to make a decision, especially when there are limitations on data gathering. In other words, we do the best we can with the information we have. Since we don't have time to evaluate every alternative and rationally pick the optimal solution, we simplify our choices and go with one that is satisfactory rather than optimal; this is called satisficing.
A good example of satisficing under bounded rationality might happen when your engine quits. The best place to land might be behind you. Rather than taking the time to do a 360-degree turn to scope out every possible field and analyze the optimal choice, we quickly limit our choices to those in our immediate field of view, perhaps processing wind direction, obstacles and other factors. The landing area we select may not be the absolute best choice, but the satisficing heuristic allows us to accept good enough. Even this rapid decision-making is subject to cognitive pitfalls. Cognitive anchoring, for example, may lead us to like our first option better than subsequent options.
Interpreting Facts
Our primary training emphasizes a fact-based view of risk management. We look at the hazards and sequentially attempt to mitigate or control them. Preflight planning is the first step that arms us with the information we need to make informed, rational choices. Knowing simple facts such as wind speeds, runway orientations, frequencies, distances, fuel supplies and burn rates helps us make better rational decisions.
Before each flight, we inspect aircraft systems to identify physical hazards that could affect safety of flight. Does the engine have oil? Is the air intake clear? Are the ailerons attached? By performing a systematic preflight, we can eliminate known causes of failure and are equipped to make better go/no-go decisions. Presumably a failure on the checklist requires a redo (e.g., add oil) or a no-go decision (wings are missing).
The preflight checklist can look like a simple binary decision tree with nice, simple answers: Enough oil? Yes or no. But it can get murky and subject to judgment very quickly. For example, does the gas have water or contamination in it? Here we begin seeing ambiguity, subject to our cognitive dysfunction.
So what do you do if you find water in your fuel? Do you keep sumping until it's gone and continue as planned, or do you abort and schedule a visit with the mechanic? It depends. Do you know why? How much water? When and where was the aircraft last fueled? I would argue that the way you deal with facts and data from the preflight inspection and all of your preflight planning will be skewed by the myriad cognitive biases fighting to win your decision.
Perhaps you are subject to the optimism bias: "I'm sure I got the water out." Perhaps you have confirmation or expectation bias: "I didn't expect to find water, so the sample I am looking at is clearly avgas. I don't need to take another sample." Perhaps you will be persuaded by the normalcy bias: "I've never had a problem with water in the fuel before, so it is not likely a problem this time."
I am not implying you need to ground the plane every time something is outside normal parameters. You do, however, need to step back and recognize your interpretations are skewed by one or more cognitive biases. Failing to account for cognitive bias in your interpretation of the data may be as likely to kill you as the water in the gas.
The Ultimate Checklist Item?
The most important item on my checklist is a two-part question I ask before committing to any flight: What is it about this particular flight that is going to kill me, and have I been intellectually honest in evaluating and mitigating the risks? Because every flight has its own risk profile, the answer is always different. Asking, and answering, this question is my last-chance, fail-safe point for checking and rethinking any faulty heuristics and reevaluating my go/no-go decision.
One of my more recent flights involved a 500-foot takeoff roll on an 800-foot backcountry runway. I knew the numbers going in; I had practiced a lot of takeoffs and landings to verify I had the performance I would need, and I did not take a passenger. On the approach, I aborted the first landing attempt, not because I had a bad approach or was going to land long, but because I was seriously questioning my judgment in landing at a place with such tight margins for my departure the next morning. By landing, I would be committing to the next day's takeoff.
Because of my ultimate checklist item, I knew the most dangerous aspect of this particular flight was landing at an airstrip presenting conditions very near the limits of my skills and my aircraft's takeoff performance. On the second landing attempt, I was so busy examining the tree clearance needed for my departure route the next day that I made a less-than-stellar landing.
The next morning, during my takeoff roll, I fixated on getting the tail up to keep it from bouncing on the rocks (an example of the availability heuristic; I had broken a tailwheel earlier in the year), and I forgot to relax the pressure and allow the plane to accelerate more efficiently in the tail-low position, which resulted in a longer ground roll than I wanted. While the takeoff had the necessary margin to get off the runway and clear the trees, it was tighter than it needed to be.
I got a lesson I will never forget thanks to a bias acquired earlier in the year.
Recipes
Flying has its own particular set of risks, which is perhaps why I am drawn to it. Ongoing training and practice are the most important ingredients in my recipe for risk management, and dubiousness is another key one: I always question the judgment of the person ultimately in charge of my decisions, me. I mix that with some critical self-evaluation, a dash of risk-taking and a pinch of good luck.
It's said we can't make our own good luck. Maybe not, but we can minimize the need for it by being aware of humans' innate ability to rationalize away warnings against the outcomes we want. And if we don't push the envelope's edges once in a while, we won't know where they are. Maybe there are old and bold pilots, after all.
Mike Hart is an Idaho-based commercial/IFR pilot with 1,000 hours, and the proud owner of a 1946 Piper J3 Cub and a Cessna 180. He also is the Idaho liaison to the Recreational Aviation Foundation.
This article appeared in the December 2013 issue of Aviation Safety Magazine.