Tesla, the electric car maker, recently announced that it would recall nearly all of its vehicles sold in the United States to fix defects in its driver-assistance feature, Autopilot, but critics say the move falls short of solving Autopilot’s safety problems.
According to the Washington Post (WP) on the 17th (local time), experts point out that Tesla’s recall will not fix Autopilot’s fundamental problems.
The National Highway Traffic Safety Administration (NHTSA) had earlier concluded, after investigating Autopilot-related crashes, that the feature lacked adequate safeguards against foreseeable misuse. In response, Tesla announced on the 13th that it would recall 2 million vehicles and issue a software update adding warning functions.
However, experts say that changes to Autopilot should not be limited to issuing more warnings to drivers; they call for stronger restrictions, such as limiting the areas where the feature can be activated at all.
Matthew Wansley, a professor at New York’s Cardozo School of Law who specializes in automotive technology, said Autopilot should not be usable on roads with cross traffic, noting that this is why many of the crashes occur at intersections.
Jennifer Homendy, chair of the U.S. National Transportation Safety Board (NTSB), who has taken a more critical stance on Tesla’s driver-assistance technology than NHTSA, said she welcomed Tesla taking action, but questioned how regulators could verify that the changes were properly made when the recall is voluntary.
As early as 2017, after investigating several deaths involving Autopilot, the NTSB recommended that the feature be disabled outside the conditions for which it was designed.
Senator Richard Blumenthal, a Democrat, criticized Tesla’s move as “far from enough,” adding, “It’s really problematic to rely on Tesla’s own enforcement.”
“When a car hits an obstacle or another vehicle, or goes off the road, there must be stronger measures than voluntary compliance from the company,” he stressed.
According to the WP, fatal Autopilot crashes frequently occur when the feature is operating outside its operational design domain (ODD), the specific locations and conditions for which it was designed.
Meanwhile, according to the Associated Press, a Los Angeles (LA) court on the 13th ordered Kevin Riad, believed to be the first Tesla driver criminally charged over an Autopilot-related death, to pay $23,000 (about 30 million won) in restitution.
In December 2019, Riad was driving his Tesla Model S in California with Autopilot engaged when, shortly after exiting a highway, the car ran a red light at an intersection and crashed into a Honda Civic, killing two people. The Tesla was traveling at more than 110 km per hour at the time.
Riad pleaded not guilty to the criminal charges but was convicted and sentenced to probation in June; the restitution order is the latest development. A separate civil lawsuit filed by the victims’ families is also underway.
“This tragic accident could have been prevented if Tesla’s recently announced recall had restricted the use of Autopilot to controlled-access highways,” Donald Slavik, a lawyer representing the families, said in a statement.
SOPHIA KIM
US ASIA JOURNAL