Scientific American
Uber Self-Driving Car Fatality Reveals the Technology’s Blind Spots
The ride-sharing company has halted its autonomous vehicle testing while it investigates the accident in Arizona
By Larry Greenemeier on March 21, 2018
A pilot model of the Uber self-driving car is displayed at the Uber Advanced Technologies Center on Sept. 13, 2016, in Pittsburgh. The make and model are similar to the vehicle involved in the Tempe, Ariz., accident. Credit: Angelo Merendino Getty Images
A self-driving Uber sport utility vehicle struck and killed a pedestrian in Tempe, Ariz., on Sunday night. Elaine Herzberg, 49, had been pushing a bicycle across a busy road about 100 meters from the closest pedestrian crosswalk when she stepped in front of the vehicle, which was traveling 38 miles per hour in a 35 mile-per-hour zone, Tempe police chief Sylvia Moir told the San Francisco Chronicle. The fatal accident prompted Uber to temporarily halt testing of its driverless vehicles on public roads in Phoenix, Pittsburgh, San Francisco and Toronto.
Herzberg’s death is the first reported incident of a pedestrian killed by a self-driving car, and it raises questions about whether such vehicles are ready to operate autonomously on public roads. The vehicle’s cameras and other sensors apparently did not detect the victim, and the car made no attempt to brake or otherwise avoid her. An Uber employee was in the Volvo XC90 SUV acting as a safety operator but told police he did not have time to react to avoid hitting Herzberg.
Self-driving cars rely on a combination of sensors and data systems to navigate and avoid obstacles. The vehicles typically include some combination of global positioning systems (GPS), light detection and ranging (LiDAR) sensors, radar, cameras and other equipment to help detect lane markings, bicycles, other vehicles and pedestrians. Each of these systems has particular strengths and weaknesses. “One of the things that we have noticed about accidents involving self-driving cars is that they seem strange from a human perspective; for example, the vehicles do not hit the brakes prior to the collision, which is something most human drivers do,” says Bart Selman, a computer science professor at Cornell University and director of the Intelligent Information Systems Institute. “That’s because the vehicles make decisions based on what their sensors detect. If its sensors don’t detect anything, the vehicle won’t react at all.” That was evident following a fatal accident in May 2016, when a Tesla Model S using its driver-assist Autopilot technology failed to brake to avoid hitting a tractor trailer that was making a left turn across its lane, killing the Tesla’s driver.
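Selman’s point—that an autonomous vehicle reacts only to what its fused sensor picture contains—can be illustrated with a toy sketch. Every name and threshold below is invented for illustration; this is not Uber’s or Tesla’s actual software, only a minimal picture of detection-gated braking:

```python
# Toy illustration of detection-gated braking. If no sensor reports an
# obstacle, the fused view is empty and the planner never issues a brake
# command at all. All function names and thresholds are hypothetical.

def fuse_detections(lidar_hits, radar_hits, camera_hits):
    """Combine the obstacle lists from each sensor into one fused view.
    If every sensor misses, the fused view contains no obstacle."""
    return lidar_hits + radar_hits + camera_hits

def plan_braking(detections, stop_distance_m=30.0):
    """Brake only when a fused detection is closer than the threshold."""
    for obstacle in detections:
        if obstacle["distance_m"] < stop_distance_m:
            return "BRAKE"
    return "CRUISE"  # no detection -> no reaction, unlike a human driver

# A pedestrian the cameras miss in the dark but LiDAR still sees:
fused = fuse_detections(lidar_hits=[{"distance_m": 25.0}],
                        radar_hits=[], camera_hits=[])
print(plan_braking(fused))  # BRAKE

# If every sensor fails to report her, the planner never brakes:
print(plan_braking(fuse_detections([], [], [])))  # CRUISE
```

The sketch captures why such crashes look strange to humans: the failure is upstream, in perception, so the downstream planner behaves as if the road were clear.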
Given the time the Uber accident occurred—10 P.M. local time—it is possible the vehicle’s cameras did not see the pedestrian, but its LiDAR and radar should not have been affected by the darkness, says Ragunathan Rajkumar, a professor of electrical and computer engineering in Carnegie Mellon University’s CyLab Security and Privacy Institute. “Self-driving vehicles are trained to identify crosswalks and yield to a person crossing a road,” Rajkumar says. (He has helped lead Carnegie Mellon’s efforts to develop autonomous vehicles, including the “Boss” SUV that won the 2007 DARPA Urban Challenge.) “Even in a jaywalking scenario [such as this] the vehicle is still always looking for obstacles in its path,” so its failure to see the pedestrian is puzzling, he notes.
Uber is one of several companies developing self-driving cars, with an eye toward eventually putting fully autonomous vehicles on public roads. Waymo, a subsidiary of Google's parent company, Alphabet, is testing similar technology throughout Arizona (including Tempe) and California as well as in Detroit, Austin and other cities. The main difference is that Waymo has been testing completely driverless vehicles—without even a human safety operator—in Arizona since October. Tesla CEO Elon Musk also has plans to include driverless technology in his company’s vehicles, which currently have an Autopilot driver-assist feature that still requires human control.
Uber’s policy of having a human operator onboard its self-driving vehicles “is perhaps the most interesting aspect of this particular accident,” says Subbarao Kambhampati, a professor in Arizona State University’s School of Computing, Informatics and Decision Systems Engineering. “Unlike the Tesla’s infelicitously named Autopilot, which is meant to be only a glorified driver’s assistant, the Uber cars were supposed to be fully autonomous. Nevertheless, unlike Waymo’s, Uber self-driving cars here in the [East Valley region near Phoenix] have always had a driver—supposedly ready to intercede in tricky situations.”
Uber has human operators in its vehicles because the technology is not mature enough to be completely driverless, according to Rajkumar. “The human has a very specific safety role to play,” he says. Many unexpected scenarios can arise on the road that the vehicle’s software cannot handle; the car drives itself in many situations, but not in all of them. A human driver is most likely to take over when road conditions are bad—such as when ice or snow hides the lane markers—or in downtown urban areas with lots of taxis weaving through traffic and people jaywalking.
A human operator must proactively assume driving responsibilities from the car before it encounters a scenario that it is unable to safely negotiate, says Gary Marchant, a professor of emerging technologies, law and ethics at Arizona State. But it is unlikely that person would be able to prevent a crash by taking over for a self-driving system at the last minute because most accidents are unexpected and happen suddenly, Marchant says.
Uber released a statement on March 19 via the company’s Twitter account saying, “Our hearts go out to the victim’s family. We’re fully cooperating with @TempePolice and local authorities as they investigate this incident.” The National Transportation Safety Board also announced Monday it is sending a team to Tempe to investigate the collision. “The investigation will address the vehicle’s interaction with the environment, other vehicles and vulnerable road users such as pedestrians and bicyclists,” according to an agency statement.
“In some ways, a self-driving fatality was to be expected sooner or later, and it [was] only a matter of time,” Kambhampati says. “Arizona [Gov. Doug Ducey] has been quite deliberately courting self-driving car companies, and the Phoenix metro area has become the proving ground for this technology, with both Uber and Waymo operating in our suburbs.”
Rajkumar agrees such an accident was inevitable, but also points out that road testing is the only way for self-driving technology to truly improve. “A good amount of testing can happen in computer simulation, but that goes only so far because it can’t be entirely faithful to the unpredictable situations the vehicle will face in the real world,” he says. “This [accident] is the scenario that people working on the self-driving vehicle technology have always dreaded. But the big picture is that the technology has the potential to save many lives over time by reducing the number of crashes, injuries and fatalities. We’re going through a transition period, and to me the transition time is the tough part.”
ABOUT THE AUTHOR(S)
Larry Greenemeier
Larry Greenemeier is the associate editor of technology for Scientific American, covering a variety of tech-related topics, including biotech, computers, military tech, nanotech and robots.