- Tesla is recalling 362,758 vehicles over issues with its Full Self-Driving software that allow vehicles to exceed speed limits or drive through intersections in an illegal or unpredictable manner, according to a filing with the National Highway Traffic Safety Administration (NHTSA).
- The issue affects a range of years across the full lineup, including certain Model 3, Model X, Model Y and Model S units produced between 2016 and 2023.
- Tesla said it will release a free over-the-air (OTA) software update for affected vehicles and will send a notification letter to owners by April 15, 2023.
Tesla is recalling hundreds of thousands of vehicles over safety issues with the company’s Full Self-Driving (FSD Beta) automated driving software. The recall affects a total of 362,758 vehicles, including certain Model 3, Model X, Model Y and Model S EVs manufactured between 2016 and 2023.
Filings with NHTSA indicate that vehicles using FSD Beta may behave in an unsafe manner, with particular concern at intersections. According to NHTSA documents, vehicles may proceed straight through an intersection while in a turn-only lane, enter a stop sign-controlled intersection without coming to a complete stop, or proceed into an intersection during a steady yellow traffic signal without due caution. The software may also fail to respond to changes in posted speed limits or to slow down adequately when entering areas with slower traffic.
Tesla will release a free over-the-air (OTA) software update to address the problem. Owner notification letters are expected to be mailed by April 15, 2023. Owners can contact Tesla customer service at 877-798-3752. Tesla's number for this recall is SB-23-00-001.
NHTSA’s Office of Defects Investigation previously opened a preliminary investigation into the software's performance. The investigation was prompted by an accumulation of crashes in which Tesla vehicles, operating with Autopilot engaged, struck first responders’ vehicles stopped in the roadway or at the roadside while tending to pre-existing collision scenes, according to NHTSA. The initial preliminary evaluation was later upgraded to an Engineering Analysis (EA) to extend the existing crash analysis, evaluate additional data sets, perform vehicle evaluations and explore the extent to which Autopilot and related Tesla systems may exacerbate human factors or behavioral safety risks by undermining the effectiveness of driver supervision.