US regulators expanded a probe into Tesla’s “Autopilot” system, moving the investigation closer to a potential recall of a controversial feature in Elon Musk’s electric vehicles.
The National Highway Traffic Safety Administration is investigating whether “Autopilot and associated Tesla systems may exacerbate human factors or behavioural safety risks by undermining the effectiveness of the driver’s supervision,” according to a summary statement.
The agency now considers the probe an “engineering analysis” — which in NHTSA parlance upgrades the status from a “preliminary evaluation” — to determine “whether a safety recall should be initiated or the investigation should be closed.”
Tesla did not immediately respond to a request for comment.
NHTSA opened the probe in August 2021 after identifying 11 crashes in which a Tesla with Autopilot or Traffic Aware Cruise Control engaged struck a first responder vehicle; five additional crashes fitting the pattern were later identified.
Forensic data from 11 of those incidents showed that the drivers, despite having their hands on the steering wheel, took no action to avert a crash in the two to five seconds before impact.
The agency also probed more than 100 crashes not involving an emergency vehicle in which Tesla Autopilot or another driver-assistance system was engaged.
In about half of these cases, evidence suggests the driver was “insufficiently responsive” to driving conditions, NHTSA said.
Looking at a subset of 43 of those crashes that yielded more detailed data, NHTSA determined that in 37, the driver’s hands were on the steering wheel in the last second prior to the collision.
The automaker has defended the safety of the Autopilot feature, saying that when used correctly it reduces the chance of an accident.
But NHTSA said, “A driver’s use or misuse of vehicle components … does not necessarily preclude a system defect” particularly “if the driver behavior in question is foreseeable in light of the system’s design.”