The federal government’s top auto safety regulator is investigating why Tesla didn’t issue a recall last month when it updated the software in its cars to improve their ability to recognize stopped emergency vehicles such as police cars and fire engines.
The regulator, the National Highway Traffic Safety Administration, has also ordered Tesla to provide data on its Full Self-Driving software, which a small number of owners have tested on public roads.
The NHTSA opened a formal investigation over the summer into 12 accidents in which Tesla cars operating in Autopilot mode — a driver assistance system that can independently steer, brake and accelerate a car — failed to detect stopped emergency vehicles whose lights were flashing in low-light conditions.
In a letter to Tesla on Tuesday, the agency reminded the company that federal law requires automakers to initiate recalls if they find defects that pose a safety risk.
NHTSA told the company to provide detailed information about a software update shipped in late September that changed Autopilot and improved its ability to detect emergency lighting.
The letter told Tesla to indicate whether it plans to issue a recall in connection with the update and, if not, to explain the legal or technical basis for declining to do so.
“Any manufacturer that issues a wireless update that fixes a defect that poses an unreasonable risk to motor vehicle safety must submit an accompanying recall notice in a timely manner,” the agency said in the letter.
The letter was sent by Gregory Magno, chief of the Vehicle Defects Division in the NHTSA’s Office of Defects Investigation, to Eddie Gates, Tesla’s director of field quality.
NHTSA has also ordered Tesla to provide the number of owners who have received Full Self-Driving software and copies of any agreements the company has with the owners. Tesla CEO Elon Musk has described Full Self-Driving as a technology that allows cars to drive autonomously in most conditions. But the software is not capable of driving a car without the active involvement of a human driver.