Syn Vision For Approaches
The other evening I was returning from a flight to the east coast of Florida in an aircraft Sarasota Avionics kindly loaned me for the day. I landed just past dusk when, rare for Florida, the temperature and dewpoint converged, and fog was forming as I tied down. We occasionally get marine layers off the Gulf, too, but if that happens four times a year, I’d be surprised.
The best the approaches into Venice can do is 360-foot MDAs and a mile of required visibility. It’s unlikely they’d be of much help in dense fog. For the past three days by early evening, the ASOS had been reporting 200 feet and a ¼ mile, if not indefinite. I’m glad I made it in before it got worse. “Well,” my friend Dan mentioned half seriously, “you’ve always got the synthetic vision.”
True enough. The airplane has a pair of Aspen EFD1000s with synthetic vision. My initial reaction was that I’d never use syn vision to land or carry on below MDA, but after I thought about that, I asked myself why. The kneejerk answer is that FAR 91.175 is so baked into me, and as an instrument instructor I’ve taught its thou-shalt-not religion for so long, that I don’t know anything else. In case you’re rusty on your FARs, 91.175 requires the runway or its environment to be in sight before descending below MDA or DH.
There are technical reasons the regulation is worded the way it is, and it isn’t born of bureaucratic capriciousness. Instrument approaches are designed to be safe, and in practice they are, if you fly them exactly the way they’re charted and if the equipment you’re flying is functioning correctly. The obstacle clearance is carefully considered, with margins for system and human error. The FAA routinely flight checks procedures looking for anomalies. But once you depart the black lines, you’re on your own.
To find a runway out of the clag, however, you necessarily have to fly down a funnel to the minimum obstacle clearance, at which point you’re supposed to be able to see the runway visually and land safely. For a standard CAT I ILS, that’s only 200 feet, and some GPS approaches offer minimums nearly that low. Two hundred feet isn’t much, which is why instrument pilots who hope to enjoy a long and rewarding career had best not make a habit of busting DA/DHs and MDAs.
On conventional instrument approaches, the actual statistical risk of busting descents is one of those ineffables. Where and how you do it and in what conditions drives how risky it really is. It’s one thing to burn 50 feet on a needles-centered CAT I ILS, but another to dive out of the clouds from a GPS MDA with nothing showing but faith and hope. Yet pilots do it, although I suspect not routinely.
I was once at a Connecticut airport standing outside the FBO shack in dense fog with an instrument student. We were waiting to get at least a half mile to depart on a training flight. We heard one of the local jet operators check in, inbound on the ILS. I told my student he’d get to at least hear a genuine missed approach. As we stood outside on the sodden gray ramp, waiting to hear the roar of spooling engines, we heard instead the unmistakable double squeak of tires on pavement. The lineman shrugged. No way they had the required minimum vis of a mile. It was RVR weather. At least it was an empty leg.
When staring at the digital glory of synthetic vision on a PFD, the temptation to do this routinely must be overwhelming. In really low weather, the runway may not be visible out the windshield, but it’s bigger than hell right there on the PFD. What’s the risk of just going for it if you don’t see anything at DA/MDA? Probably not that much, although I doubt there’s any meaningful data to put a number on it. Synthetic vision doesn’t depict close-in obstacles off the ends of runways, but a runway with an instrument approach isn’t likely to have meaningful close-in obstacles, at least from the missed approach point forward to the threshold. But this varies by approach.
I’ve flown dozens of visual approaches in airplanes with synthetic vision and never noticed one in which the runway wasn’t where it was supposed to be or was detectably misaligned, though admittedly I wasn’t looking for misalignment. Even the synthetic centerline matches the real-world view, which is impressive when you consider that synthetic vision is doing that with WAAS GPS matched to a terrain database. We’ve come to take this level of performance for granted without thinking about how well it works.
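Under the hood, that alignment is mostly geometry: given a WAAS position and a surveyed runway location from the database, the display projects the runway point into screen space. Here’s a minimal flat-earth, pinhole-camera sketch of the idea; the function name and parameters are my own illustration, not any vendor’s actual implementation.

```python
import math

def project_runway_point(ac_lat, ac_lon, ac_alt_m, heading_deg,
                         pt_lat, pt_lon, pt_alt_m,
                         focal_px=800.0):
    """Project a database point (e.g., a runway threshold) into
    PFD screen coordinates using a flat-earth, pinhole-camera model.
    Assumes wings-level flight; returns (x, y) in pixels relative to
    screen center (x right, y down), or None if the point is behind us."""
    # Flat-earth conversion of lat/lon deltas to meters (fine at short range)
    north = (pt_lat - ac_lat) * 111_320.0
    east = (pt_lon - ac_lon) * 111_320.0 * math.cos(math.radians(ac_lat))
    down = ac_alt_m - pt_alt_m  # positive when the point is below the aircraft

    # Rotate into the aircraft's body frame using heading
    h = math.radians(heading_deg)
    fwd = north * math.cos(h) + east * math.sin(h)    # along track
    rgt = -north * math.sin(h) + east * math.cos(h)   # right of track
    if fwd <= 0:
        return None  # behind the camera, not drawable

    # Pinhole projection onto the display
    return (focal_px * rgt / fwd, focal_px * down / fwd)
```

With the aircraft 300 meters up, a threshold about 1,100 meters straight ahead projects to the screen centerline, below center, which is where you’d expect the runway to sit in the synthetic view. The real systems add attitude, terrain elevation and a proper earth model, but the core transform is this simple.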
I haven’t tried landing under the hood with just synthetic vision because it doesn’t provide the necessary depth-perception cues. But in low vis, you could certainly use it to get close enough to acquire the runway visually, then land normally. I’d guess that would work in quarter-mile visibility or a little less. Are people actually doing it, bending FAR 91.175 despite the lack of a legal framework? No one I know has confessed, but human nature being human nature, I’d be surprised if someone hasn’t done it and if some aren’t doing it routinely—homegrown CAT II.
I’d like to see some system reliability and accuracy data on synthetic vision, but I think it’s time to allow its use for descent and visibility credit for not-for-hire operations. Sure, there’s risk in doing so, but let pilots assume it if they want. The fact that we’re not doing it already probably has more to do with legal inertia than technical considerations. I asked Garmin if they had synthetic vision in mind for this kind of upgrade and they declined to comment. I take that as a yes.
For years, the transport industry has used head-up displays to qualify crews for lower landing minimums or at least to smooth the transition from the gauges to the visual. I flew one in an Alaska Airlines simulator 20 years ago. I’ve poked around for data on how widespread HUDs are in airline use, but I can’t get an accurate sense of market penetration; it’s probably the majority of the fleet. On a recent Southwest flight, the captain told me the entire fleet is HUD equipped and that he liked it and used it on every approach, including visuals.
Periodically, we see attempts to offer HUDs for light GA aircraft but they never seem to gain traction. We reported on the latest offering in this video. MyGoFlight says it can integrate synthetic vision, but I’m not sure what that would look like.
HUDs have proven expensive and cumbersome, requiring a combiner-type display to be positioned in front of the pilot’s eyes only when needed. The military is way ahead on this technology; it now projects imagery on a helmet-mounted display and, before that, on a fixed combiner permanently in the pilot’s sightline out the canopy.
And for GA, how often would a HUD matter? How many pilots fly weather that requires bitter-end transition from the gauges to the visual and how much good would a HUD really do?
Not much, I’d guess. Or maybe not enough to justify spending $10,000 to add it. This is a rarefied risk area in which there are too few accidents to make much sense of the probabilities. In the aviation press, we write evergreens with tips about how to transition from the gauges to the visual as though it’s fraught with hazards and tricks. It isn’t. Just look out the window.
The Southwest skipper told me he saw benefits in using it for visuals because it gave good windshear cues and made staying on speed easier. Fair enough. But little airplanes are still hand-eye machines and that’s why a lot of us still fly them. Do we really need or even benefit from so much technology just to land a Bonanza? Would it help?
If the FAA ever approves synthetic vision for approach-minimums credit, I think it would be festooned with so many system and certification requirements that it would be impossible to certify. But then at one time, I thought that about GPS approaches and non-TSO’d gyros, too. Maybe the very act of my doubting it will, as before, cause it to come true. We can only wish.