Autopilots: Can Cars Learn From Airplanes?


That we live in interesting times is made ever more evident by an email I got from reader Matthew Sawhill following last week’s fatal crash of a Tesla Model S being operated on the car’s much-touted autopilot feature. Conceding that cars aren’t airplanes, Sawhill asked: “Why can’t the auto industry and mass media learn from (and set expectations based on) 70 years of experience with autopilot technology in aviation?”

It’s a fair question and illuminates a trend that seems ever more obvious to me: Automation and autonomy are steadily blurring the lines between not just transportation modes, but everything in modern life. In transportation, it may matter less if it’s a car, an airplane, a bus, a boat or a train than it does that it gets from A to B largely without human intervention. To think that we’re not going there—and probably sooner than many admit—is to occupy the last besieged island of delusional Luddites.

Consider the question, though. What could automakers learn from the aviation experience? First, that no matter how clever, cautious and foresightful they are, they’ll get things wrong and (a) the machine will malfunction, (b) homo the sap will figure out a way to defeat whatever multiple layers of safety interlocks the smart kids have provided, (c) the human-machine interface will create errors and (d) people will die as a result. How many billions of pixels have we devoted to the idea that the benefits of aircraft automation must be weighed against the human’s mental acuity and hand-eye coordination atrophying to the point of uselessness? Full automation will likely eliminate that tradeoff and perhaps reduce fatalities markedly, but it will introduce new faults no one thought of.

The Tesla crash is an example of this. In case you weren’t paying attention, the Tesla was on autopilot clipping down a divided four-lane, non-limited-access highway. A tractor trailer made a left-hand turn in front of the car, crossing one of the many cutovers this type of highway typically has. The driver was far enough out of the loop not to have noticed and the sedan went under the trailer, killing him. The driver was an aficionado of the Tesla in general and of its autopilot features specifically and had posted several videos touting the system’s capabilities.

Tesla has plainly said this technology is not intended to be hands-off/eyes-off autonomy; if the driver understood that, it seems reasonable to conclude he didn’t act on the understanding. (News reports claimed he may have been watching a DVD, but that’s almost a distraction from the larger issue of believing the Tesla system has more capability than it does.)

One thing automotive manufacturers will not be able to do is build systems as relatively simple as aircraft autopilots are. You read that right. That’s because autonomous driving is a far more complex problem than autonomous flying. The proximities are much closer, there are many more vehicles, and there are many more variables in the form of road standards, obstacles, weather effects, mixing with non-autonomous vehicles and sheer human unpredictability. And there are no enforced separation standards, as we have in aviation. We may reach this kind of chaos in aviation if drones achieve the swarming level, which some think they will. But we’re not there yet.

While airplanes have the additional challenge of the third dimension, this turns out to be an advantage: Cars can’t do vertical separation, or at least not in a way that will render them useful after cheating the collision. Fully autonomous aircraft autopilots aren’t in their infancy. GPS, INS, GPWS, TAWS, autothrottles, smart servos and other technologies have been around for quite some time and we know how to use them. As we reported, Diamond recently demonstrated a fully autonomous flight of one of its twins and as soon as they work out the details of electric braking, the airplanes will be capable of chock-to-chock flights with little or no human intervention. It then becomes simply a matter of filling in the regulatory boxes and seeing if anyone wants to buy such a thing. (Some certainly will.)

Against the backdrop of that and, say, a 777 being capable of routine autolandings, consider the Tesla’s challenge. Rather than the relatively trivial task of tracking an electronic glideslope—which we’ve been doing for 50 years—and judging closure rate with radar altimetry to a flat, obstacle-free, internationally approved strip of concrete, the Tesla uses a combination of cameras, ultrasonic sensors and short-range radar to detect and avoid obstacles in a highly non-standard environment. News reports indicated that the ambient light was such that the camera couldn’t distinguish the light-colored truck body from a light background. (The human eye, of course, probably could have.) Elon Musk was quoted as saying the forward-looking radar is programmed to recognize and not react to overpasses and may have interpreted the truck as just that.
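To make that failure mode concrete, here’s a minimal sketch of the kind of discrimination step Musk’s explanation implies. The function name, the clearance threshold and the labels are assumptions invented for illustration; they are not Tesla’s actual logic.

```python
# Hypothetical illustration of the overpass/obstacle discrimination described
# above. The threshold and labels are invented; the real system's logic is unknown.

def classify_radar_return(clearance_above_road_m: float,
                          overpass_threshold_m: float = 1.0) -> str:
    """Classify a forward radar return by how high it sits above the roadway."""
    if clearance_above_road_m > overpass_threshold_m:
        # Returns well above the road are treated as overhead structures,
        # so automatic braking is suppressed to avoid false alarms.
        return "overhead_structure"
    return "obstacle"

# A high-riding trailer whose underside clears the road by more than the
# threshold gets binned with bridges and sign gantries: no braking response.
print(classify_radar_return(clearance_above_road_m=1.2))  # -> overhead_structure
```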

There’s another thing the automation can’t do that the driver could have, had he been engaged: exercise intuition. The accident scenario was a classic left-turn threat that motorcyclists know all about. If you’re paying attention, you see the potential threat many seconds before it develops; you analyze it and respond by slowing or maneuvering. It’s an OODA loop. A car autopilot can probably be taught to do some form of this, but will it ever be able to distinguish a parked vehicle from one that’s about to dart across a lane just by the look of it? Maybe. But it will be a while.
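For what it’s worth, here’s a toy sketch of that observe-orient-decide-act cycle applied to the left-turn threat. The vehicle model, the five-second time-to-conflict test and the responses are all assumptions made for illustration, not anyone’s production logic.

```python
# Toy OODA-loop sketch for the left-turn threat described above.
# The data model, threshold and responses are illustrative assumptions only.

from dataclasses import dataclass

@dataclass
class OncomingVehicle:
    range_m: float            # distance to the vehicle waiting to turn
    closure_mps: float        # how fast the gap is closing
    angled_across_lane: bool  # nose pointed across our path?

def ooda_step(target: OncomingVehicle) -> str:
    # Observe: sensors (or the rider's eyes) report range, closure and attitude.
    time_to_conflict_s = target.range_m / max(target.closure_mps, 0.1)
    # Orient: a vehicle angled across the lane with a short time-to-conflict
    # is a developing left-turn threat, even if it isn't moving yet.
    threat = target.angled_across_lane and time_to_conflict_s < 5.0
    # Decide and act: slow or set up an escape path before the gap closes.
    return "slow_and_cover_brakes" if threat else "maintain"

print(ooda_step(OncomingVehicle(range_m=60, closure_mps=25, angled_across_lane=True)))
```

The hard part is the orient step: telling a parked car from one about to dart across the lane is exactly the intuition a simple threshold test doesn’t capture.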

There are dozens of such scenarios. Another one is at an intersection near my house. It’s a complex intersection where right turn on red is restricted for only part of the signal cycle, conveyed through a sign with an arrow that illuminates for just 10 seconds of that cycle. Could a Tesla autopilot deal with this? Whether it can or can’t is less relevant than realizing an airplane doesn’t have to. Even if it auto-taxis to the ramp, it can already do that via WAAS-GPS. A few optical sensors might improve the process.

That a Tesla owner would dismiss the risk of running on full autopilot while snoozing is perhaps understandable, given how well the system apparently functions and given that, until this crash, there was no risk data on Tesla autopilot accidents at all. According to the National Safety Council, the fatal accident rate in the U.S. is 1.08/100 million miles driven. If Tesla’s data is correct, the Florida accident was the first fatal in 130 million miles of autopilot operation. A single data point does not a risk matrix make, but at least the autopilot can claim a slightly better record.
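Running the column’s own numbers makes that comparison explicit. The figures below are simply the ones cited above, plugged into a back-of-the-envelope calculation:

```python
# Back-of-the-envelope comparison using the figures cited above.

nsc_rate_per_100m = 1.08          # NSC fatal accident rate per 100 million miles
autopilot_fatals = 1              # the Florida crash
autopilot_miles = 130_000_000     # Tesla's claimed Autopilot mileage to date

autopilot_rate_per_100m = autopilot_fatals / autopilot_miles * 100_000_000
print(f"Autopilot:        {autopilot_rate_per_100m:.2f} fatal accidents per 100M miles")  # ~0.77
print(f"All U.S. driving: {nsc_rate_per_100m:.2f} fatal accidents per 100M miles")        # 1.08
```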

What all this means, I think, is that automakers can expect unexpected fatalities with automation as a factor just as airplane manufacturers have experienced. It’s just inevitable. And while the bright new world of autonomy may eventually drive fatalities dramatically downward, I doubt if anyone reading this will see the end of them in his or her lifetime. The immediate lesson for drivers of semi-autonomous cars is this: Stay in the loop, with eyes on the road, or risk making the last bad judgment of your life.
