Autopilots: Can Cars Learn From Airplanes?

That we live in interesting times is made ever more evident by an email I got from reader Matthew Sawhill following last week’s fatal crash of a Tesla Model S being operated on the car’s much-touted autopilot feature. Conceding that cars aren’t airplanes, Sawhill asked: “Why can't the auto industry and mass media learn from (and set expectations based on) 70 years of experience with autopilot technology in aviation?”

It’s a fair question and illuminates a trend that seems ever more obvious to me: Automation and autonomy are steadily blurring the lines between not just transportation modes, but everything in modern life. In transportation, it may matter less if it’s a car, an airplane, a bus, a boat or a train than it does that it gets from A to B largely without human intervention. To think that we’re not going there—and probably sooner than many admit—is to occupy the last besieged island of delusional Luddites.

Consider the question, though. What could automakers learn from the aviation experience? First, that no matter how clever, cautious and foresightful they are, they'll get things wrong and (a) the machine will malfunction, (b) homo the sap will figure out a way to defeat whatever layers of safety interlocks the smart kids have provided, (c) the human-machine interface will create errors and (d) people will die as a result. How many billions of pixels have we devoted to the idea that the benefits of aircraft automation come at the cost of the human's mental acuity and hand-eye coordination atrophying to the point of uselessness? Full automation will likely eliminate that tension and perhaps reduce fatalities markedly, but it will introduce new faults no one thought of.

The Tesla crash is an example of this. In case you weren't paying attention, the Tesla was on autopilot clipping down a divided four-lane, non-limited-access highway. A tractor trailer made a left-hand turn in front of the car, crossing at one of the many median cutovers this type of highway typically has. The driver was far enough out of the loop not to have noticed, and the sedan went under the trailer, killing him. The driver was an aficionado of the Tesla in general and of its autopilot features specifically, and had posted several videos like this one touting the system's capabilities.

Even if he understood that Tesla has plainly said this technology is not intended to be hands-off/eyes-off autonomy, it seems reasonable to conclude that he didn't act on that understanding. (News reports claimed he may have been watching a DVD, but that's almost a distraction from the larger issue of believing the Tesla system has more capability than it does.)

One thing automotive manufacturers will not be able to do is build systems as relatively simple as aircraft autopilots are. You read that right. That's because autonomous driving is a far more complex problem than autonomous flying. The proximities are much closer; there are many more vehicles and many more variables in the form of road standards, obstacles, weather effects, mixing with non-autonomous vehicles and sheer human unpredictability; and there are no enforced separation standards, as there are in aviation. We may reach this kind of chaos in aviation if drones achieve the swarming level, which some think they will. But we're not there yet.

While airplanes have the additional challenge of the third dimension, this turns out to be an advantage: Cars can't do vertical separation, or at least not in a way that will leave them useful after cheating the collision. Fully autonomous aircraft autopilots aren't in their infancy. GPS, INS, GPWS, TAWS, autothrottles, smart servos and other technologies have been around for quite some time and we know how to use them. As we reported, Diamond recently demonstrated a fully autonomous flight of one of its twins, and as soon as they work out the details of electric braking, the airplanes will be capable of chock-to-chock flights with little or no human intervention. It then becomes simply a matter of filling in the regulatory boxes and seeing if anyone wants to buy such a thing. (Some certainly will.)

Against the backdrop of that and, say, a 777 being capable of routine autolandings, consider the Tesla's challenge. Rather than the relatively trivial task of tracking an electronic glideslope—which we've been doing for 50 years—and judging closure rate with radar altimetry to a flat, obstacle-free, internationally approved strip of concrete, the Tesla uses a combination of cameras, ultrasonic sensors and short-range radar to detect and avoid obstacles in a highly non-standard environment. News reports indicated that the ambient light was such that the camera couldn't distinguish the trailer's white side from the brightly lit sky behind it. (The human eye, of course, probably could have.) Elon Musk was quoted as saying the forward-looking radar is programmed to recognize and not react to overpasses, and the system may have interpreted the truck as just that.
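
To make that failure mode concrete, here's a minimal sketch of how a conservative two-sensor fusion rule can miss a real obstacle. Everything in it—the names, the thresholds, the logic—is my own invention for illustration, not Tesla's actual code:

    from dataclasses import dataclass

    @dataclass
    class Target:
        height_above_road_m: float  # centroid height of the radar return
        contrast_score: float       # camera detection confidence, 0 to 1

    def radar_confirms(t: Target) -> bool:
        # Hypothetical rule: returns well above the road are treated as
        # overpasses and ignored, a trade against false braking events.
        return t.height_above_road_m < 1.5

    def camera_confirms(t: Target) -> bool:
        # A white trailer side against a bright sky yields low contrast,
        # so the detection score can fall below threshold.
        return t.contrast_score > 0.4

    def brake_for(t: Target) -> bool:
        # Requiring BOTH sensors to agree suppresses nuisance braking,
        # but each sensor's blind spot can veto the other's detection.
        return radar_confirms(t) and camera_confirms(t)

    # A high, low-contrast trailer side defeats both checks at once:
    trailer = Target(height_above_road_m=1.8, contrast_score=0.2)
    assert brake_for(trailer) is False

The toy example's point: demanding agreement between sensors suppresses false alarms, but it also means each sensor's blind spot can cancel the other's valid detection.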

There’s another thing the automation can’t do that the driver could have, had he been engaged: exercise intuition. The accident scenario was a classic left-turn threat that motorcyclists know all about. If you’re paying attention, you see the potential threat many seconds before it develops; you analyze it and respond by slowing or maneuvering. It’s an OODA loop: observe, orient, decide, act. A car autopilot can probably be taught to do some form of this, but will it ever be able to distinguish a parked vehicle from one that’s about to dart across a lane just by the look of it? Maybe. But it will be a while.
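
For the programming-minded, here's what the skeleton of such a loop might look like. This is a toy sketch under my own assumptions—the vehicle model and the five-second conflict threshold are invented—and its orient step shows exactly where the intuition gap lives:

    from dataclasses import dataclass

    @dataclass
    class Vehicle:
        distance_m: float    # range to our car
        closing_mps: float   # closure rate; > 0 means converging
        stationary: bool     # currently stopped, e.g., waiting at a cutover

    def observe(traffic):
        # Observe: take in every vehicle the sensors can resolve.
        return list(traffic)

    def orient(vehicles):
        # Orient: flag anything converging within ~5 seconds of conflict.
        # Note what kinematics alone cannot do: a stationary truck has no
        # closure rate yet, so one about to dart across is invisible here.
        # A rider's intuition reads intent; this filter reads only motion.
        return [v for v in vehicles
                if not v.stationary
                and v.closing_mps > 0
                and v.distance_m / v.closing_mps < 5.0]

    def decide(threats):
        # Decide: pick the least-risk response.
        return "slow" if threats else "maintain"

    def act(decision, speed_mps):
        # Act: apply the decision; the loop then repeats many times a second.
        return max(speed_mps - 2.0, 0.0) if decision == "slow" else speed_mps

    traffic = [Vehicle(distance_m=40.0, closing_mps=12.0, stationary=False)]
    print(act(decide(orient(observe(traffic))), speed_mps=25.0))  # 23.0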

There are dozens of such scenarios. Another one is at an intersection near my house. It's a complex intersection with a momentary restriction on right turns on red, conveyed through a sign with an arrow that illuminates for just 10 seconds of the signal cycle. Could a Tesla autopilot deal with this? Whether it can or can't is less relevant than realizing an airplane doesn't have to. Even auto-taxiing to the ramp is already within reach via WAAS GPS; a few optical sensors might improve the process.

That a Tesla owner would dismiss the risk of running on full autopilot while snoozing is perhaps understandable, given how well the system apparently functions and that there's a complete lack of risk data on Tesla autopilot accidents because this is the first one. According to the National Safety Council, the fatal accident rate in the U.S. is 1.08 per 100 million miles driven. If Tesla's data is correct, the Florida accident was the first fatality in 130 million miles of autopilot operation. A single data point does not a risk matrix make, but at least the autopilot can claim a slightly better record.
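
The arithmetic behind that comparison is worth spelling out, since it simply puts the two figures cited above over the same denominator:

    # Back-of-envelope comparison of the two fatality rates cited above.
    nsc_per_100m = 1.08                  # U.S. average fatalities per 100M miles
    tesla_per_100m = 1 / 130e6 * 100e6   # one fatality in 130M autopilot miles

    print(nsc_per_100m)    # 1.08
    print(tesla_per_100m)  # ~0.77 -- slightly better on paper, but a single
                           # data point supports no real conclusion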

What all this means, I think, is that automakers can expect unexpected fatalities with automation as a factor just as airplane manufacturers have experienced. It’s just inevitable. And while the bright new world of autonomy may eventually drive fatalities dramatically downward, I doubt if anyone reading this will see the end of them in his or her lifetime. The immediate lesson for drivers of semi-autonomous cars is this: Stay in the loop, with eyes on the road, or risk making the last bad judgment of your life.

Comments (13)

My gut sense is that autonomous cars will become accepted only when they behave like a chauffeur - fully autonomous, with no input required by the occupant.

In that sense, Google is on the right path by not even providing a steering wheel.

However, not every trip amounts to picking a destination and pressing the "Go" button. Like a chauffeur, they would have to be able to respond to random changes, such as the occupant saying, "Hey, look, that's my buddy - pull over so I can talk to him!" Easy for a human chauffeur, but for a computer? Look? Look where? Which human is 'your' buddy?

It reminds me of the scene in the movie "Total Recall" in which Schwarzenegger, trying to escape from the bad guys, hops into a computerized "Johnny Cab". The android asks, "Where to?" The hero replies, "Drive!"
"Please state a destination."
Hero: "Anywhere! Just go!" as he frantically looks over his shoulder and the rapidly approaching villains..
...and so on, until the hero climbs into the driver's seat and drives himself.

I'm not so arrogant as to believe I've thought of this first. I'm sure it's being considered in think tanks developing these vehicles. How well it's implemented will determine how well autonomous cars are accepted.

They are coming, but I think they're a bit further away than the evangelists would have us believe. Those last few percentage points of complete autonomy are the hardest and slowest to implement. It's like the old joke about homebuilts - 90% complete, 90% to go.

Posted by: KIRK WENNERSTROM | July 4, 2016 10:04 AM

"...cars can't do vertical separation, or at least in a way that will render them useful after cheating the collision"

Ha ha! Hey Paul, check out the opening scene of the "Dukes of Hazzard" where the General Lee climbs to FL 0.20 before continuing to evade Sheriff Roscoe! I read they trashed some 40 Dodge Chargers in making that series. A few of these have actually been recovered and made roadworthy again.

But seriously, back to your thesis. Another thing that must be faced with this issue: is "blaming" a piece of impersonal software for the death of a loved one going to be any different from blaming a fallible human being? Will that be more difficult to reconcile personally or legally? I hope I never have to find out (or any of us, for that matter). Indeed we live in interesting times. Happy Independence Day.

Posted by: A Richie | July 4, 2016 11:54 AM

Fortunately, the legal profession is here to save us from ourselves. They already smell the blood in the shark-infested waters and are just waiting for the opportunity to sue the cr@p out of every manufacturer. This will drive the costs into the flight levels.

Posted by: Leo LeBoeuf | July 4, 2016 12:03 PM

Back in high school I had a Corvair convertible. Whenever Ming (friend's nickname) was about to throw up, we sensed it and knew when to pull over. Saved our noses and my car many times. Just sayin'.

Long live recreational aviation, manual transmissions, bicycles, and walking. Maybe island life might not be too bad...

Posted by: Dave Miller | July 4, 2016 4:03 PM

After some thousands of hours driving a non-autopilot equipped airplane around the skies I finally have a bird with a decent autopilot, and while I feel no desire to go back to all-manual piloting, the trap which closed on poor Mr. Brown is all too obvious to me. In droning along cross-country with hands in the lap & feet on the floor I often realize that although I may be nominally staring out the windows, I'm really not LOOKING.

The visual stimulation level experienced during road travel obviously beats staring at empty sky for 3 or 4 hours, but if a driver's continued experience with the automation produces the same level of confidence as an airplane autopilot, it won't be long until his or her brain likewise drifts into neutral. I would suspect that Mr. Brown's relatively long and confidence-building prior experience with the Tesla's automation led him to conclude that cruising down a divided highway, even though not a limited-access one, was not a task requiring close personal attention.

Posted by: John Wilson | July 4, 2016 4:41 PM

I look at an autopilot in a light airplane sort of like a cruise control in a car. But I'm really excited about the prospect of autonomous cars because of what they'll mean to family members who can't drive for medical reasons. Imagine if you woke up tomorrow in a hospital and a doctor told you that you'd probably never drive again. Never mind flying; you'd be dependent on someone to drive you *everywhere*.

Posted by: Robert Gatlin-Martin | July 4, 2016 7:58 PM

One question that has not been asked: could the driver have saved himself if he had been in the loop and driving the car? Or if he was paying attention, could he have gained control fast enough and with enough awareness to avoid the accident? Tesla may be able to answer these questions with the sensor data from the auto. Having driven a Tesla with the auto drive system and several aircraft with autopilots, I don't believe there is a comparison. I also don't think that there is much the auto industry can learn from aircraft autopilots. The environment for an auto drive system and an autopilot is very different. An auto drive system must deal with huge amounts of non-standard data created by many random events, especially other drivers who don't follow the rules. An autopilot, on the other hand, is a very straightforward programming task due to standardization. It's an interesting fact that removing all human drivers from the roads would make the self-driving cars much safer, easier to program and closer to the autopilots.

I found the Tesla auto drive system to be excellent. It "saw" things in California traffic that I didn't "see". When riding as a passenger, I felt much safer when the auto drive system was on. It was a better driver than my brother-in-law.

The huge change here is that a Tesla is upgradeable. Unlike all other autos, which degrade over time, a Tesla continues to get better, just like a mobile phone. When I was in CA driving the Tesla, it went through a major upgrade adding "Summon" mode. Summon mode enables a Tesla to back out of and into a garage without hitting the mirrors (as my brother-in-law did the day after Summon mode was enabled). He uses Summon mode all the time now.

Posted by: DANA NICKERSON | July 4, 2016 8:59 PM

In a video interview with the legendary Gen. Jimmy Doolittle taken not long before he died, he said that blind flying (instrument flying, as we call it today) was quite safe as long as you were the only airplane in the sky. But the moment a second airplane took off, you had a real problem.

I think we may be approaching a similar conundrum on the ground; it began with cellphone distracted drivers, is in full bloom today with texting drivers, and now we are beginning to see fully heads-down and eyes-closed drivers. Can ground-based ATC be far behind?

Posted by: A Richie | July 5, 2016 9:30 AM

Good article Paul. One thing that makes aircraft autopilots much easier is that they don't have to "see" anything around them. They derive their sense of position from GPS, temperature, air pressure, etc. and compare that to a map embedded in their memory. Even if they have the capability to sense surrounding traffic, it is through an electronic signal transmitted by the other aircraft. By comparison, much of what a car needs to process is both electronic (radar and ultrasonic) and optical. Computers have the power to map their GPS position in relation to the memory map, but the optics of recognizing what they see and processing its significance is not ready for prime time yet. How long that will take is anyone's guess.

As you said, true driving automation is probably much farther off than the proponents would have us believe. In addition to the cost of developing the cars, it will require billions of dollars in road improvements that neither the states nor the Feds currently can afford. Even if and when it happens, the wise driver, like the prudent pilot, should pay attention to their surroundings. As Air France discovered, when a confused autopilot disconnects and says "it's your airplane", bad things can happen if you aren't ready to take control.

Posted by: John McNamee | July 5, 2016 11:02 AM

Once again, autonomy gets bad press because of the shortcomings of what some oxymoronically characterize as "semi-autonomous" systems. News Flash: ain't no such thing.

Posted by: Tom Yarsley | July 5, 2016 11:56 AM

Perhaps all new vehicles should be equipped with a short-range transponder-like device. That would allow a road-based equivalent of TCAS to be possible, and while we're at it we can add roadside detectors for real-time traffic data.

Of course, taking it a step further, we should require them to be GPS/WAAS-based transponders, and mandate that all existing vehicles have one installed by some arbitrary date...say, January 1st 2020, or else they can't drive on limited-access highways or 4-lane divided highways. And these transponders would have unique identifiers...

Posted by: Gary Baluha | July 5, 2016 12:22 PM

It seems that humans have created another conundrum that autonomous lawyers will be litigating for centuries to come. I just hope there will be enough autonomous taxpayers to fund everything.

Posted by: Richard Montague | July 5, 2016 2:03 PM

Here's a thought: what will "autopilots" in cars mean for future litigation in aircraft accidents? We normalize general behaviors based on experience. So if I am accustomed to using a car autopilot (with its inherent pitfalls) and I'm a juror on a case about an airplane accident with an autopilot, am I more or less likely to find fault with either the pilot or the manufacturer? Do I internally think, "That pilot must have been dumb; he doesn't know how autopilots work," or do I think, "This plane maker is as bad as my car manufacturer; they need to be taught a lesson"?

Posted by: JEFFREY SMITH | July 6, 2016 4:16 PM
