(Bloomberg) — The US National Highway Traffic Safety Administration asked Tesla Inc. for more information about one of its vehicles colliding with a fire truck in a fatal crash in the San Francisco Bay area.
The agency reached out to the manufacturer after the incident in Contra Costa County during the Presidents’ Day holiday weekend. The county fire department said in a Feb. 18 tweet that a Tesla struck one of its trucks that was blocking lanes while responding to an earlier accident.
NHTSA has spent the last 18 months investigating how Tesla’s driver-assistance system Autopilot handles crash scenes involving fire trucks and other first-responder vehicles. It’s unclear whether the driver in the Contra Costa County incident — who was pronounced dead at the scene — was using Autopilot. A passenger in the Tesla and four firefighters were also transported to the hospital.
Read more: Tesla Robotaxis Musk Touted Years Ago Are Nowhere to Be Found
NHTSA opened the first of two active investigations into possible Autopilot defects in August 2021 after almost a dozen crashes with first-responder cars and trucks.
The following month, Tesla deployed an over-the-air update to its cars aimed at improving their ability to detect emergency vehicles. The company sent that software update without initiating a recall, leading NHTSA’s chief counsel and head of its vehicle defects division to publicly ask for technical and legal justification.
Since then, NHTSA opened a second defect investigation related to Autopilot, involving inadvertent braking, and escalated its probe into how the system handles crash scenes.
Autopilot and other driver-assistance systems can have a harder time detecting stationary vehicles and braking for them than navigating through traffic with other moving cars and trucks. In addition to scrutinizing this issue, NHTSA has been assessing Tesla’s methods for monitoring drivers using Autopilot and ensuring their engagement.
Related: Tesla Autopilot Probe Casts Eye on Role of In-Car Camera
Last week, Tesla recalled almost 363,000 cars equipped with software the company markets as Full Self-Driving Beta, which, despite the name, doesn’t make the vehicles autonomous. The company said in its recall notice that the feature could violate traffic laws before drivers — who are responsible for operating the vehicle at all times — are able to intervene.
©2023 Bloomberg L.P.