Moving Automated Driving to the Next Level
If there had been a competition for world-class back-seat drivers, my grandmother would have won it hands down. Back in the 1980s, when we were living in Massachusetts, we drove to Boston's Logan Airport to pick her up for a visit. Despite never in her entire life having been closer to New England than Ohio, she started telling me which turns to take in downtown Boston the moment I got lost, which I always did anyway, with or without her help. We made it home, but not without a lot of needless distraction.
Developers of what the Society of Automotive Engineers (SAE) calls "automated driving" face the opposite of the back-seat-driver problem: getting people in the front seat to pay attention to the road while a robot does what the SAE calls "Level 3" driving.
In a recent New York Times piece, tech writer John Markoff highlights the problems that arise when autonomous vehicles are not yet capable of 100% "hands-off" operation. Two or three years ago, the National Highway Traffic Safety Administration (NHTSA) and the SAE agreed on a classification scheme for automated driving systems. Doing all the driving yourself, as most people in older cars still do, is Level 0 (no automation). Level 5 is a system that can adapt to any and all driving conditions with no input whatsoever from the driver, who could therefore safely sleep or do crossword puzzles for the whole trip. No one has yet fielded a Level 5 system, but the standard assumes we will eventually get there. In between are vehicles such as Tesla cars equipped with the Autopilot system (Level 2) and the latest self-driving cars now being fielded by Google's autonomous-car spinoff Waymo (Level 4). But even Level 4 cars can't cope with every situation, and when a driver starts to treat a Level 2 system as if it were Level 5, trouble lies ahead.
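For those who like their taxonomies spelled out, here is one way to summarize the six levels in a few lines of code. Consider it a minimal sketch: the one-line glosses are my own paraphrase of the SAE J3016 summary linked in the Sources below, not the standard's official wording.

from enum import IntEnum

class SAELevel(IntEnum):
    """SAE J3016 driving-automation levels (paraphrased glosses, not official text)."""
    NO_AUTOMATION = 0  # the human does all the driving, all the time
    ASSISTED = 1       # a single assist feature, such as adaptive cruise control
    PARTIAL = 2        # steering and speed automated; the human must keep watching (Tesla Autopilot)
    CONDITIONAL = 3    # the car watches the road, but the human must take over on request
    HIGH = 4           # no human needed, but only within a limited driving domain (Waymo)
    FULL = 5           # no human needed under any conditions, anywhere

print(SAELevel.PARTIAL < SAELevel.FULL)  # True: the levels form an ordered scale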
The worst example so far of driver inattention in a partially autonomous vehicle happened in 2016, when a Tesla Model S in Florida failed to detect a semi that suddenly crossed the vehicle's path. Tesla warns drivers that they must be prepared to take evasive action in such situations, but this driver was apparently watching a video, which was, let us say, the last thing he saw. This fatal accident was the first such mishap in a Tesla vehicle, and Tesla has since modified its cars to test the driver's attention periodically. If the driver isn't paying consistent attention, the car terminates the Autopilot feature for the rest of the trip, forcing the driver to go back to work.
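In outline, that enforcement amounts to a simple "nag, then lock out" loop. The sketch below is my own guess at the general shape of such a system; the three-strike threshold and all the names in it are illustrative assumptions, not Tesla's actual design.

from dataclasses import dataclass

@dataclass
class AttentionMonitor:
    """Illustrative 'nag, then lock out' logic; every detail here is assumed."""
    max_ignored_warnings: int = 3  # assumed threshold, not Tesla's real value
    ignored: int = 0
    locked_out: bool = False

    def check(self, hands_on_wheel: bool) -> None:
        if self.locked_out:
            return                      # Autopilot already disabled for this trip
        if hands_on_wheel:
            self.ignored = 0            # attention restored, so reset the count
        else:
            self.ignored += 1           # another warning went unheeded
            if self.ignored >= self.max_ignored_warnings:
                self.locked_out = True  # no more Autopilot until the next trip

monitor = AttentionMonitor()
for hands in (False, False, False):     # driver ignores three checks in a row
    monitor.check(hands)
print(monitor.locked_out)               # True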
The Tesla accident is just one example of a general problem with partially autonomous vehicles, say Levels 2 through 4. They all require the driver to be prepared to regain control of the vehicle in an emergency or in any other situation the robot driver can't cope with. But as Markoff points out, going from sheer inattention to fully capable operation of a motor vehicle in a few seconds is not something people do particularly well.
Studies have shown that even for drivers who are mentally prepared for the transition, it can take as long as five seconds to adjust to the feel of the steering at a given speed and reach the point of being truly in control and capable of dealing with problems. Five seconds can be longer than you have: a car traveling at 70 MPH covers about 513 feet (156 meters) in five seconds. If the potential problem is only 200 feet away, by the time you're ready to act it may well be too late.
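A quick back-of-the-envelope check of that figure, using only the numbers quoted above (70 MPH and five seconds):

MPH_TO_FTPS = 5280 / 3600  # one mile per hour expressed in feet per second
FT_TO_M = 0.3048           # feet to meters

speed_mph = 70             # highway speed from the example above
takeover_s = 5             # reported time to regain full control

distance_ft = speed_mph * MPH_TO_FTPS * takeover_s
print(f"{distance_ft:.0f} ft ({distance_ft * FT_TO_M:.0f} m)")  # 513 ft (156 m)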
Those wanting to deploy cars with more and more autonomous features face a chicken-and-egg problem. Everybody admits that as of today, there is no system with which it is completely safe for the driver to act as though he or she were at home in bed. But to get to that point, we have to gain experience with less-than-perfect systems, all of which require the human driver's input at some point. The issue then becomes how to accustom drivers to this wholly new mode of "driving." And people being people, they are not always going to follow instructions. The man who lost his life in the Tesla accident was told to keep his hands on the steering wheel at all times. But he had found that nothing bad happened most of the time he didn't, and many others would reach the same conclusion unless the system enforced attention in some way, which it now apparently does.
As for me, I may be fairly typical in that I am not interested in automated driving systems until I can trust them at least as much as I trust my wife to drive, if not more. We may be encountering a different form of what in aesthetics is known as the "uncanny valley." Humanoid robots that look like classical robots, hardware sticking out of their metal chests and so on, don't bother us particularly. And a humanoid robot that imitates a human so well that you can't tell the difference presumably wouldn't bother us much either. But students of robotics have found that human-like robots that are close to real humans, but not close enough, give people the creeps. And it will give me the creeps, or worse, to sit behind the wheel and steer only when a machine tells me to.
If I were sort of driving and sort of not driving a car that was doing things in traffic I couldn't predict, constantly hoping I wouldn't have to intervene but always wondering whether something was about to happen that would require me to grab the wheel, well, I might as well quit my job and start teaching driver's education at Nelson's Driving School for the Chronically Nervous. Back when high schools were obliged to teach driver's ed, you learned in a car equipped with two brake pedals, one on the passenger's side where the instructor sat. My instructor got to use her pedal more than once, and I can only imagine what torment she went through as she watched me move jerkily through traffic. Riding in anything less than a Level 5 autonomous vehicle, I'd be in the same position as my unfortunate driving instructor, all the time the car was moving.
The prospects for autonomous driving hinge critically on how manufacturers and developers handle the next five or so years, before truly autonomous (Level 5) driving becomes possible. The wisest course may be to continue mainly with experiments until automakers can say, with reasonable confidence in their safety, what the bus companies have been saying all along: "Leave the driving to us."
Sources: John Markoff's article "Robot Cars Can't Count on Us in an Emergency" appeared on the New York Times website on June 7, 2017, at https://www.nytimes.com/2017/06/07/technology/google-self-driving-cars-handoff-problem.html. It refers to a summary of SAE Standard J3016, which defines the classification system for automated driving, at https://www.sae.org/misc/pdfs/automated_driving.pdf. I also referred to the Wikipedia articles on Waymo, the history of autonomous cars, and the uncanny valley.