U.S. launches formal probe into Tesla Autopilot crashes involving emergency vehicles
By Arghyadeep on Aug 16, 2021 | 04:36 AM IST
U.S. vehicle safety regulators have opened a formal investigation into Tesla Inc’s Autopilot software after a series of reported collisions involving emergency vehicles.
The National Highway Traffic Safety Administration (NHTSA) said it had identified 11 crashes in which Tesla cars “have encountered first responder scenes and subsequently struck one or more vehicles involved with those scenes” since January 2018, which caused 17 injuries and one death.
Four of the 11 crashes occurred this year, the agency said, and it has opened a preliminary evaluation of Autopilot covering 2014-2021 Tesla Models Y, X, S, and 3.
“The involved subject vehicles were all confirmed to have been engaged in either Autopilot or Traffic-Aware Cruise Control during the approach to the crashes,” the NHTSA said in a document opening the investigation, which will cover about 765,000 Tesla cars in the U.S.
It said most of the incidents took place after dark and that the crash scenes included control measures such as emergency vehicle lights, flares, or road cones.
The investigation “will assess the technologies and methods used to monitor, assist, and enforce the driver’s engagement with the dynamic driving task during Autopilot operation,” the regulators said.
The National Transportation Safety Board (NTSB) has criticized Tesla’s lack of system safeguards for its Autopilot software, which allows drivers to keep their hands off the wheel for extended periods.
In February last year, Andrej Karpathy, Tesla’s director of autonomous driving technology, said he had identified a challenge for the Autopilot system in recognizing a parked police car’s flashing emergency lights.
In June, NHTSA said it had opened investigations into 30 Tesla crashes since 2016, involving ten deaths, in which the vehicles were suspected to have been using advanced driver assistance systems.
“NHTSA reminds the public that no commercially available motor vehicles today are capable of driving themselves,” the statement said.
“Certain advanced driving assistance features can promote safety by helping drivers avoid crashes and mitigate the severity of crashes that occur, but as with all technologies and equipment on motor vehicles, drivers must use them correctly and responsibly.”
In January 2017, the agency closed a preliminary evaluation of Tesla’s Autopilot system covering 43,000 vehicles without taking any action, saying it “did not identify any defects in the design or performance” of Autopilot “nor any incidents in which the systems did not perform as designed.”
Last month, the influential nonprofit publication Consumer Reports said its research showed that Tesla’s “Full Self-Driving” software lacks safeguards, raising concerns about the safety of others on the road, and cited driver reports of “vehicles missing turns, scraping against bushes, and heading toward parked cars.”
Picture Credit: Thomas Bensmann