Complaints & Recalls
Official Safety Recalls - Important!
37 Recalls
These are official manufacturer recalls filed with NHTSA for safety defects. If you own this vehicle, contact your dealer immediately for free repairs.
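For readers who want to retrieve these same recall records programmatically rather than through www.nhtsa.gov, the short Python sketch below queries NHTSA's public recalls web service. This is a minimal illustration under stated assumptions: the endpoint api.nhtsa.gov/recalls/recallsByVehicle, its query parameters, and the JSON field names are taken from NHTSA's public developer documentation as best understood and may differ from the live service.

# Minimal sketch (assumptions noted above): fetch NHTSA recall records
# for a given make/model/model year and print a one-line summary of each.
import requests

def fetch_recalls(make: str, model: str, model_year: str) -> list:
    url = "https://api.nhtsa.gov/recalls/recallsByVehicle"
    params = {"make": make, "model": model, "modelYear": model_year}
    resp = requests.get(url, params=params, timeout=10)
    resp.raise_for_status()
    # The service is assumed to return a JSON object with a "results" list.
    return resp.json().get("results", [])

if __name__ == "__main__":
    for rec in fetch_recalls("TESLA", "MODEL 3", "2020"):
        # Field names are assumptions; fall back gracefully if they differ.
        campaign = rec.get("NHTSACampaignNumber", "unknown campaign")
        component = rec.get("Component", "")
        summary = rec.get("Summary", "")
        print(f"{campaign} | {component} | {summary[:80]}")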
Tesla, Inc.
Safety Issue:
Tesla, Inc. (Tesla) is recalling certain 2024 Cybertruck, 2017-2025 Model 3, and 2020-2025 Model Y vehicles. The tire pressure monitoring system (TPMS) warning light may not remain illuminated between drive cycles, failing to warn the driver of low tire pressure. As such, these vehicles fail to comply with the requirements of Federal Motor Vehicle Safety Standard number 138, "Tire Pressure Monitoring Systems."
Potential Risk:
FREE Recall Solution:
Additional Details:
Owners may also contact the National Highway Traffic Safety Administration Vehicle Safety Hotline at 1-888-327-4236 (TTY 1-800-424-9153), or go to www.nhtsa.gov.
Mfg Campaign: SB-24-00-018
Recall Date: Dec 17, 2024
Tesla, Inc.
Safety Issue:
Tesla, Inc. (Tesla) is recalling certain 2012-2024 Model S, 2015-2024 Model X, 2017-2023 Model 3, and 2020-2023 Model Y vehicles. In the event of an unbelted driver, the seat belt warning light and audible chime may not activate as intended. As such, these vehicles fail to comply with the requirements of Federal Motor Vehicle Safety Standard number 208, "Occupant Crash Protection."
Potential Risk:
FREE Recall Solution:
Additional Details:
Owners may also contact the National Highway Traffic Safety Administration Vehicle Safety Hotline at 1-888-327-4236 (TTY 1-800-424-9153), or go to www.nhtsa.gov.
Mfg Campaign: SB-24-00-008
Recall Date: May 28, 2024
Tesla, Inc.
Safety Issue:
Tesla, Inc. (Tesla) is recalling certain 2013, 2018-2021 Model S, 2020-2021 Model X, 2018-2022 Model 3, and 2020-2022 Model Y vehicles. A factory reset may have muted the Pedestrian Warning System (PWS) sounds. As such, these vehicles fail to comply with the requirements of Federal Motor Vehicle Safety Standard number 141, "Minimum Sound Requirements for Hybrid and Electric Vehicles."
Potential Risk:
FREE Recall Solution:
Additional Details:
Owners may also contact the National Highway Traffic Safety Administration Vehicle Safety Hotline at 1-888-327-4236 (TTY 1-800-424-9153), or go to www.nhtsa.gov.
Mfg Campaign: SB-24-00-006
Recall Date: Feb 27, 2024
Tesla, Inc.
Safety Issue:
Tesla, Inc. (Tesla) is recalling certain 2012-2023 Model S, 2016-2024 Model X, 2017-2023 Model 3, 2019-2024 Model Y, and 2024 Cybertruck vehicles. An incorrect font size is displayed on the instrument panel for the Brake, Park, and Antilock Brake System (ABS) warning lights. As such, these vehicles fail to comply with the requirements of Federal Motor Vehicle Safety Standard numbers 105, "Hydraulic and Electric Brake Systems," and 135, "Light Vehicle Brake Systems."
Potential Risk:
FREE Recall Solution:
Additional Details:
Owners may also contact the National Highway Traffic Safety Administration Vehicle Safety Hotline at 1-888-327-4236 (TTY 1-800-424-9153), or go to www.nhtsa.gov.
Mfg Campaign: SB-24-00-003
Recall Date: Jan 30, 2024
Tesla, Inc.
Safety Issue:
Tesla, Inc. (Tesla) is recalling certain 2012-2023 Model S, 2016-2023 Model X, 2017-2023 Model 3, and 2020-2023 Model Y vehicles equipped with all versions of Autosteer leading up to the version(s) that contains the recall remedy. In certain circumstances when Autosteer is engaged, the prominence and scope of the feature's controls may not be sufficient to prevent driver misuse of the SAE Level 2 advanced driver-assistance feature.
Potential Risk:
FREE Recall Solution:
Additional Details:
Owners may also contact the National Highway Traffic Safety Administration Vehicle Safety Hotline at 1-888-327-4236 (TTY 1-800-424-9153), or go to www.nhtsa.gov.
Mfg Campaign: SB-23-00-008
Recall Date: Dec 12, 2023
Tesla, Inc.
Safety Issue:
Tesla, Inc. (Tesla) is recalling certain 2016-2023 Model S, Model X, 2017-2023 Model 3, and 2020-2023 Model Y vehicles equipped with Full Self-Driving Beta (FSD Beta) software or pending installation. The FSD Beta system may allow the vehicle to act unsafe around intersections, such as traveling straight through an intersection while in a turn-only lane, entering a stop sign-controlled intersection without coming to a complete stop, or proceeding into an intersection during a steady yellow traffic signal without due caution. In addition, the system may respond insufficiently to changes in posted speed limits or not adequately account for the driver's adjustment of the vehicle's speed to exceed posted speed limits.
Potential Risk:
FREE Recall Solution:
Additional Details:
Owners may also contact the National Highway Traffic Safety Administration Vehicle Safety Hotline at 1-888-327-4236 (TTY 1-800-424-9153), or go to www.nhtsa.gov.
Mfg Campaign: SB-23-00-001
Recall Date: Feb 15, 2023
Tesla, Inc.
Safety Issue:
Tesla, Inc. (Tesla) is recalling certain 2017-2022 Model 3 vehicles. The second-row left seat belt buckle and second-row center seat belt anchor may have been incorrectly reassembled during vehicle service.
Potential Risk:
FREE Recall Solution:
Additional Details:
Owners may also contact the National Highway Traffic Safety Administration Vehicle Safety Hotline at 1-888-327-4236 (TTY 1-800-424-9153), or go to www.nhtsa.gov.
Mfg Campaign: SB-22-20-004
Recall Date: Oct 21, 2022
Tesla, Inc.
Safety Issue:
Tesla, Inc. (Tesla) is recalling certain 2017-2022 Model 3, 2020-2022 Model Y, and 2021-2022 Model S and Model X vehicles. The window automatic reversal system may not react correctly after detecting an obstruction. As such, these vehicles fail to comply with the requirements of Federal Motor Vehicle Safety Standard number 118, "Power-Operated Window Systems."
Potential Risk:
FREE Recall Solution:
Additional Details:
Owners may also contact the National Highway Traffic Safety Administration Vehicle Safety Hotline at 1-888-327-4236 (TTY 1-800-424-9153), or go to www.nhtsa.gov.
Mfg Campaign: SB-22-00-013
Recall Date: Sep 19, 2022
Tesla, Inc.
Safety Issue:
Tesla, Inc. (Tesla) is recalling certain 2018-2022 Model 3 Performance vehicles. The unit of speed (mph or km/h) may fail to display on the speedometer while in Track Mode. As such, these vehicles fail to comply with the requirements of Federal Motor Vehicle Safety Standard number 101, "Controls and Displays."
Potential Risk:
FREE Recall Solution:
Additional Details:
Owners may also contact the National Highway Traffic Safety Administration Vehicle Safety Hotline at 1-888-327-4236 (TTY 1-800-424-9153), or go to www.nhtsa.gov.
Mfg Campaign: SB-22-00-008
Recall Date: Apr 18, 2022
Tesla, Inc.
Safety Issue:
Tesla, Inc. (Tesla) is recalling certain 2020-2022 Model Y, Model X, Model S, and 2017-2022 Model 3 vehicles. The Boombox function allows sounds to be played through an external speaker while the vehicle is in motion, which may obscure the Pedestrian Warning System (PWS) sounds. As such, these vehicles fail to comply with the requirements of Federal Motor Vehicle Safety Standard number 141, "Minimum Sound Requirements for Hybrid and Electric Vehicles."
Potential Risk:
FREE Recall Solution:
Additional Details:
Owners may also contact the National Highway Traffic Safety Administration Vehicle Safety Hotline at 1-888-327-4236 (TTY 1-800-424-9153), or go to www.nhtsa.gov.
Mfg Campaign: SB-22-00-003
Recall Date: Apr 12, 2022
Tesla, Inc.
Safety Issue:
Tesla, Inc. (Tesla) is recalling certain 2018-2019 Model S, Model X, and 2017-2020 Model 3 vehicles equipped with Autopilot Computer 2.5 and operating certain firmware releases. The rearview image may not immediately display when the vehicle begins to reverse. As such, these vehicles fail to comply with the requirements of Federal Motor Vehicle Safety Standard number 111, "Rear Visibility."
Potential Risk:
FREE Recall Solution:
Additional Details:
Owners may also contact the National Highway Traffic Safety Administration Vehicle Safety Hotline at 1-888-327-4236 (TTY 1-800-424-9153), or go to www.nhtsa.gov.
Mfg Campaign: SB-22-00-004
Recall Date: Mar 18, 2022
Tesla, Inc.
Safety Issue:
Tesla, Inc. (Tesla) is recalling certain 2020-2022 Model S, Model X, Model Y, and 2017-2022 Model 3 vehicles. The Boombox function allows sounds to be played through an external speaker while the vehicle is in motion, which may obscure the Pedestrian Warning System (PWS) sounds. As such, these vehicles fail to comply with the requirements of Federal Motor Vehicle Safety Standard number 141, "Minimum Sound Requirements for Hybrid and Electric Vehicles."
Potential Risk:
FREE Recall Solution:
Additional Details:
Owners may also contact the National Highway Traffic Safety Administration Vehicle Safety Hotline at 1-888-327-4236 (TTY 1-800-424-9153), or go to www.nhtsa.gov.
Mfg Campaign: SB-22-00-003
Recall Date: Feb 4, 2022
Tesla, Inc.
Safety Issue:
Tesla, Inc. (Tesla) is recalling certain 2021-2022 Model S and Model X, 2017-2022 Model 3, and 2020-2022 Model Y vehicles. The audible chime may not activate when the vehicle starts and the driver has not buckled their seat belt. As such, these vehicles fail to comply with the requirements of Federal Motor Vehicle Safety Standard number 208, "Occupant Crash Protection."
Potential Risk:
FREE Recall Solution:
Additional Details:
Owners may also contact the National Highway Traffic Safety Administration Vehicle Safety Hotline at 1-888-327-4236 (TTY 1-800-424-9153), or go to www.nhtsa.gov.
Mfg Campaign: SB-22-00-002
Recall Date: Feb 1, 2022
Tesla, Inc.
Safety Issue:
Tesla, Inc. (Tesla) is recalling certain 2016-2022 Model S and Model X, 2017-2022 Model 3, and 2020-2022 Model Y vehicles. The "rolling stop" functionality available as part of the Full Self-Driving (Beta) software may allow the vehicle to travel through an all-way stop intersection without first coming to a stop.
Potential Risk:
FREE Recall Solution:
Additional Details:
Owners may also contact the National Highway Traffic Safety Administration Vehicle Safety Hotline at 1-888-327-4236 (TTY 1-800-424-9153), or go to www.nhtsa.gov.
Mfg Campaign: SB-22-00-001
Recall Date: Jan 27, 2022
Tesla, Inc.
Safety Issue:
Tesla, Inc. (Tesla) is recalling all 2017-2020 Model 3 vehicles. The rearview camera cable harness may be damaged by the opening and closing of the trunk lid, preventing the rearview camera image from displaying.
Potential Risk:
FREE Recall Solution:
Additional Details:
Owners may also contact the National Highway Traffic Safety Administration Vehicle Safety Hotline at 1-888-327-4236 (TTY 1-800-424-9153), or go to www.nhtsa.gov.
Mfg Campaign: SB-21-17-008
Recall Date: Dec 21, 2021
Tesla, Inc.
Safety Issue:
Tesla, Inc. (Tesla) is recalling certain 2017-2021 Model S, Model 3, Model X, and 2020-2021 Model Y vehicles operating software version 2021.36.5.2. A communication error may cause false forward-collision warning (FCW) or unexpected activation of the automatic emergency brake (AEB) system.
Potential Risk:
FREE Recall Solution:
Additional Details:
Owners may also contact the National Highway Traffic Safety Administration Vehicle Safety Hotline at 1-888-327-4236 (TTY 1-800-424-9153), or go to www.nhtsa.gov.
Mfg Campaign: SB-21-00-004
Recall Date: Oct 29, 2021
Tesla, Inc.
Safety Issue:
Tesla, Inc. (Tesla) is recalling certain 2020-2021, 2023 Model 3 vehicles. The left and/or right side curtain air bag may have been improperly secured to the roof rail, which could result in a twisted air bag. As such, these vehicles fail to comply with the requirements of Federal Motor Vehicle Safety Standard numbers 214, "Side Impact Protection" and 226, "Ejection Mitigation."
Potential Risk:
FREE Recall Solution:
Additional Details:
Owners may also contact the National Highway Traffic Safety Administration Vehicle Safety Hotline at 1-888-327-4236 (TTY 1-800-424-9153), or go to www.nhtsa.gov.
Mfg Campaign: SB-21-20-006
Recall Date: Oct 25, 2021
Tesla, Inc.
Safety Issue:
Tesla, Inc. (Tesla) is recalling certain 2020-2021 Model Y and 2019-2021 Model 3 vehicles. The front suspension lateral link fasteners may loosen, allowing the lateral link to separate from the sub-frame.
Potential Risk:
FREE Recall Solution:
Additional Details:
Owners may also contact the National Highway Traffic Safety Administration Vehicle Safety Hotline at 1-888-327-4236 (TTY 1-800-424-9153), or go to www.nhtsa.gov.
Mfg Campaign: SB-21-31-003
Recall Date: Oct 25, 2021
Tesla, Inc.
Safety Issue:
Tesla, Inc. (Tesla) is recalling certain 2019-2021 Model 3 and 2020-2021 Model Y vehicles. The brake caliper bolts may be loose, allowing the brake caliper to separate and contact the wheel rim.
Potential Risk:
FREE Recall Solution:
Additional Details:
Owners may also contact the National Highway Traffic Safety Administration Vehicle Safety Hotline at 1-888-327-4236 (TTY 1-800-424-9153), or go to www.safercar.gov.
Mfg Campaign: SB-21-33-002
Recall Date: May 25, 2021
Tesla, Inc.
Safety Issue:
Tesla, Inc. (Tesla) is recalling certain 2018-2020 Model 3 and 2019-2021 Model Y vehicles. One or both fasteners that secure the front seat shoulder belt to the b-pillar may not be properly attached.
Potential Risk:
FREE Recall Solution:
Additional Details:
Owners may also contact the National Highway Traffic Safety Administration Vehicle Safety Hotline at 1-888-327-4236 (TTY 1-800-424-9153), or go to www.safercar.gov.
Mfg Campaign: SB-21-20-001
Recall Date: May 25, 2021
Consumer Complaints
681 Complaints
Tesla, Inc.
Defect Description:
FORWARD COLLISION AVOIDANCE: AUTOMATIC EMERGENCY BRAKING
Potential Consequences:
While operating TACC with lane keeping assist, I experienced multiple iterations of phantom braking when the road and weather was clear ahead. This phantom braking could potentially cause other drivers to collide with the vehicle that unexpectedly braked in the middle of the highway. This problem has been a common issue with multiple people taking their cars to the service center to get checked out with no problem identified. There are no warning lights or indicators that the Tesla is going to phantom brake on the highway. Recommend having Tesla provide information regarding this phantom braking.
NHTSA Complaint ID: 11561858
Complaint Date: Dec 26, 2023
Tesla, Inc.
Defect Description:
FORWARD COLLISION AVOIDANCE: ADAPTIVE CRUISE CONTROL
Potential Consequences:
At 11:47 AM Pacific time on December 23, 2023 my Tesla model three auto pilot system failed after a recent update mandated by the NHTSA. Ever since the update, the vehicle, cruise control and auto pilot systems no longer function properly, and while I was in, cruise control and auto pilot traffic stopped ahead and the auto pilot system disengaged while I was waiting while it was slowing down, which has never happened in four years of ownership and over 60,000 miles of driving. The new recall software update has made the vehicle unsafe and it should be retracted and corrected. What once was a safe feature is now unsafe. Please investigate this with Tesla to prevent possible injury or death resulting from this programming failure.
NHTSA Complaint ID: 11561638
Complaint Date: Dec 23, 2023
Tesla, Inc.
Defect Description:
LANE DEPARTURE: ASSIST
Potential Consequences:
Your latest mandates has arguably made the car less safe by removing what allowed the system to operate safely. Please roll back this “recall” it’s not a recall in the slightest it’s a shot at someone or multiple people that dislike Tesla.
NHTSA Complaint ID: 11561228
Complaint Date: Dec 21, 2023
Tesla, Inc.
Defect Description:
LANE DEPARTURE: ASSIST
Potential Consequences:
An incident occurred in April of 2023 while running FSD Beta 11.3.6 firmware. I have been unable to retest this location on more recent software, so this concern may have been mitigated with a subsequent update. As seen in the video, the yellow lane marking on the left transitioned from the roadway up onto the movable center median. Once on the bridge, the left repeater camera and the center display visualization both illustrate the lane centering was definitely biased towards the left side of the lane, a possible indication the vision-based lane centering was miscued by the yellow line now appearing on the barrier instead of the road surface. This left bias brought the vehicle uncomfortably close to the barrier, which resulted in a manual takeover. It is difficult to judge based on the low resolution of the repeater camera view, but my estimation is that the tires were approximately 12 inches away from the protruding bottom lip of the barrier. [XXX] This same left lane marking concern also occurs at the entrance to the Oakland, California Caldecott tunnel eastbound bore. The Google maps link provided shows the yellow line which defines the left lane edge disappears at the tunnel entrance, at which time the vision-based lane centering seemed to be cuing off of the curb as the lane edge, rather than there being a marking that would delineate a lane edge further away from the curb. An FSD beta user (name and date unavailable) reported contacting the curb at this location at approximately the same date as the GGB recording. [XXX] The GGB matter was brought to the attention of the Bay Area Toll Authority with the suggestion they contact Tesla. I have been unable to verify that the Toll Authority followed up on that suggestion. The release notes for Beta 11.4.4 suggest the issue might have been addressed, to wit, "Improved offset consistency when controlling for static obstacles." INFORMATION REDACTED PURSUANT TO THE FREEDOM OF INFORMATION ACT (FOIA), 5 U.S.C. 552(B)(6)
NHTSA Complaint ID: 11561104
Complaint Date: Dec 20, 2023
Tesla, Inc.
Defect Description:
UNKNOWN OR OTHER
Potential Consequences:
cabin heat not working, vehicle’s defroster is also not work well enough to comply with federal motor vehicle safety standards regarding windshield visibility. error message on the vehicle’s user interface saying that heating and air conditioning are limited or unavailable. The blower motor, which sends air into the cabin, will remain operational.
NHTSA Complaint ID: 11560990
Complaint Date: Dec 19, 2023
Tesla, Inc.
Defect Description:
UNKNOWN OR OTHER
Potential Consequences:
To Whom It May Concern at the National Highway Traffic Safety Administration (NHTSA), I am writing to urgently express my concerns about the latest software update for the Tesla Model 3’s autopilot system, which, in my view, significantly compromises safety. As a Model 3 owner for over two years, I have generally found the vehicle and its features to be reliable and safe. However, the recent update has introduced an overly stringent hands-on-wheel detection mechanism that is not only inconvenient but also potentially hazardous. The new update requires frequent and often forceful interaction with the steering wheel to assure the system of driver presence. This change is drastically different from my previous experience, where I received only one hands-on-wheel violation in two years. The current sensitivity of the system disrupts the smooth operation of the autopilot, leading to frequent and abrupt disengagements. I have found myself struggling to maintain the system’s activation, inadvertently causing the vehicle to exit autopilot mode multiple times. This issue goes beyond mere inconvenience; it actively detracts from driving safety. The need for constant and sometimes aggressive adjustments to satisfy the system’s requirements is distracting and can lead to erratic vehicle behavior. The irony is stark: a system designed to enhance driving safety and ease is now a source of potential danger and stress. The unpredictability and over-sensitivity of the updated system could lead to dangerous situations, especially on highways or in heavy traffic, where sudden disengagement of the autopilot can be particularly risky. As a driver, I now find myself more focused on keeping the autopilot engaged than on the actual driving conditions and surroundings, which is surely contrary to the feature’s intended purpose. I urge the NHTSA to investigate this issue as a matter of urgency and safety.
NHTSA Complaint ID: 11561000
Complaint Date: Dec 19, 2023
Tesla, Inc.
Defect Description:
STEERING
Potential Consequences:
To Whom It May Concern at the National Highway Traffic Safety Administration (NHTSA), I am writing to urgently express my concerns about the latest software update for the Tesla Model 3’s autopilot system, which, in my view, significantly compromises safety. As a Model 3 owner for over two years, I have generally found the vehicle and its features to be reliable and safe. However, the recent update has introduced an overly stringent hands-on-wheel detection mechanism that is not only inconvenient but also potentially hazardous. The new update requires frequent and often forceful interaction with the steering wheel to assure the system of driver presence. This change is drastically different from my previous experience, where I received only one hands-on-wheel violation in two years. The current sensitivity of the system disrupts the smooth operation of the autopilot, leading to frequent and abrupt disengagements. I have found myself struggling to maintain the system’s activation, inadvertently causing the vehicle to exit autopilot mode multiple times. This issue goes beyond mere inconvenience; it actively detracts from driving safety. The need for constant and sometimes aggressive adjustments to satisfy the system’s requirements is distracting and can lead to erratic vehicle behavior. The irony is stark: a system designed to enhance driving safety and ease is now a source of potential danger and stress. The unpredictability and over-sensitivity of the updated system could lead to dangerous situations, especially on highways or in heavy traffic, where sudden disengagement of the autopilot can be particularly risky. As a driver, I now find myself more focused on keeping the autopilot engaged than on the actual driving conditions and surroundings, which is surely contrary to the feature’s intended purpose. I urge the NHTSA to investigate this issue as a matter of urgency and safety.
NHTSA Complaint ID: 11561000
Complaint Date: Dec 19, 2023
Tesla, Inc.
Defect Description:
LANE DEPARTURE: ASSIST
Potential Consequences:
I am writing to urgently express my concerns about the latest software update for the Tesla Model 3’s autopilot system, which, in my view, significantly compromises safety. As a Model 3 owner for over two years, I have generally found the vehicle and its features to be reliable and safe. However, the recent update has introduced an overly stringent hands-on-wheel detection mechanism that is not only inconvenient but also potentially hazardous. The new update requires frequent and often forceful interaction with the steering wheel to assure the system of driver presence. This change is drastically different from my previous experience, where I received only one hands-on-wheel violation in two years. The current sensitivity of the system disrupts the smooth operation of the autopilot, leading to frequent and abrupt disengagements. I have found myself struggling to maintain the system’s activation, inadvertently causing the vehicle to exit autopilot mode multiple times. This issue goes beyond mere inconvenience; it actively detracts from driving safety. The need for constant and sometimes aggressive adjustments to satisfy the system’s requirements is distracting and can lead to erratic vehicle behavior. The irony is stark: a system designed to enhance driving safety and ease is now a source of potential danger and stress. The unpredictability and over-sensitivity of the updated system could lead to dangerous situations, especially on highways or in heavy traffic, where sudden disengagement of the autopilot can be particularly risky. As a driver, I now find myself more focused on keeping the autopilot engaged than on the actual driving conditions and surroundings, which is surely contrary to the feature’s intended purpose. You need to investigate this issue as a matter of urgency. Adjustments are necessary to prevent potential accidents and ensure that the tech helps, not hinders.
NHTSA Complaint ID: 11561009
Complaint Date: Dec 19, 2023
Tesla, Inc.
Defect Description:
FORWARD COLLISION AVOIDANCE: AUTOMATIC EMERGENCY BRAKING
Potential Consequences:
During a month-long road trip in the US, I experienced frequent "phantom braking" incidents while using adaptive cruise control. In some cases, Autosteer was also active. By "phantom braking" I refer to sudden hard braking by the car itself in clear daylight conditions on interstate highways with no vehicles visible ahead of me for at least 1/2 mile. By "frequent" I mean that this behavior occurred multiple times per hour over multiple days. I decided to cease using cruise control for the remainder of the trip after the following episode occurred. With cruise control active on a four lane interstate, I was changing lanes to the right to allow a pickup truck to pass me on the left. I had just passed a semitrailer, who was in the right lane. I could see him in my rear-view mirror as I began the lane change. The road was empty and visible for at least a mile ahead on a sunny morning. We were all traveling about 75 mph in a 70 mph zone. While straddling the lanes, the car suddenly slammed on its brakes, requiring both vehicles behind me to take evasive action. I over-rode the car’s action by pressing the accelerator pedal swiftly and firmly. No accident occurred. I stopped using cruise control entirely for several months, and I have only rarely used it since then, and only when the road behind me is empty. In those few recent trials, I have not experienced any phantom braking. Finally, I speculate that the braking may have been triggered by "mirages" of what look like puddles on the road. We humans are familiar with this phenomenon, but the camera AI may have interpreted those as obstacles.
NHTSA Complaint ID: 11560404
Complaint Date: Dec 16, 2023
Tesla, Inc.
Defect Description:
FORWARD COLLISION AVOIDANCE: ADAPTIVE CRUISE CONTROL
Potential Consequences:
During a month-long road trip in the US, I experienced frequent "phantom braking" incidents while using adaptive cruise control. In some cases, Autosteer was also active. By "phantom braking" I refer to sudden hard braking by the car itself in clear daylight conditions on interstate highways with no vehicles visible ahead of me for at least 1/2 mile. By "frequent" I mean that this behavior occurred multiple times per hour over multiple days. I decided to cease using cruise control for the remainder of the trip after the following episode occurred. With cruise control active on a four lane interstate, I was changing lanes to the right to allow a pickup truck to pass me on the left. I had just passed a semitrailer, who was in the right lane. I could see him in my rear-view mirror as I began the lane change. The road was empty and visible for at least a mile ahead on a sunny morning. We were all traveling about 75 mph in a 70 mph zone. While straddling the lanes, the car suddenly slammed on its brakes, requiring both vehicles behind me to take evasive action. I over-rode the car’s action by pressing the accelerator pedal swiftly and firmly. No accident occurred. I stopped using cruise control entirely for several months, and I have only rarely used it since then, and only when the road behind me is empty. In those few recent trials, I have not experienced any phantom braking. Finally, I speculate that the braking may have been triggered by "mirages" of what look like puddles on the road. We humans are familiar with this phenomenon, but the camera AI may have interpreted those as obstacles.
NHTSA Complaint ID: 11560404
Complaint Date: Dec 16, 2023
Tesla, Inc.
Defect Description:
FUEL/PROPULSION SYSTEM
Potential Consequences:
We had Model 3 with instance of unintended acceleration at the gate of a parking lot. This car was bought in May 2020. On December 6, 2020, she was on the way out of the parking gate and suddenly the car accelerated, hitting the side wall of the gate. The car flew over a two-way street, resulting in a compound fracture of her right hip requiring internal fixation and transfusion. The vehicle was inspected by the police. There were no warning lamps, messages, or other symptoms prior to failure.
NHTSA Complaint ID: 11560002
Complaint Date: Dec 13, 2023
Tesla, Inc.
Defect Description:
POWER TRAIN
Potential Consequences:
We had Model 3 with instance of unintended acceleration at the gate of a parking lot. This car was bought in May 2020. On December 6, 2020, she was on the way out of the parking gate and suddenly the car accelerated, hitting the side wall of the gate. The car flew over a two-way street, resulting in a compound fracture of her right hip requiring internal fixation and transfusion. The vehicle was inspected by the police. There were no warning lamps, messages, or other symptoms prior to failure.
Mfg Campaign: 11560002
Recall Date: Dec 13, 2023
Tesla, Inc.
Defect Description:
VEHICLE SPEED CONTROL
Potential Consequences:
We had Model 3 with instance of unintended acceleration at the gate of a parking lot. This car was bought in May 2020. On December 6, 2020, she was on the way out of the parking gate and suddenly the car accelerated, hitting the side wall of the gate. The car flew over a two-way street, resulting in a compound fracture of her right hip requiring internal fixation and transfusion. The vehicle was inspected by the police. There were no warning lamps, messages, or other symptoms prior to failure.
Corrective Action:
We had Model 3 with instance of unintended acceleration at the gate of a parking lot. This car was bought in May 2020. On December 6, 2020, she was on the way out of the parking gate and suddenly the car accelerated, hitting the side wall of the gate. The car flew over a two-way street, resulting in a compound fracture of her right hip requiring internal fixation and transfusion. The vehicle was inspected by the police. There were no warning lamps, messages, or other symptoms prior to failure.
Additional Notes:
We had Model 3 with instance of unintended acceleration at the gate of a parking lot. This car was bought in May 2020. On December 6, 2020, she was on the way out of the parking gate and suddenly the car accelerated, hitting the side wall of the gate. The car flew over a two-way street, resulting in a compound fracture of her right hip requiring internal fixation and transfusion. The vehicle was inspected by the police. There were no warning lamps, messages, or other symptoms prior to failure.
Mfg Campaign: 11560002
Recall Date: Dec 13, 2023
Tesla, Inc.
Defect Description:
FORWARD COLLISION AVOIDANCE: ADAPTIVE CRUISE CONTROL
Potential Consequences:
Phantom braking occurred twice while driving east on Highway SR 152. Our car was traveling straight with no cars in front of us. I was using cruise control set at the speed limit when the car suddenly slammed on the brakes, slowing from 65 mph to 20 mph in a few seconds. This happened twice in a matter of 10 minutes. I no longer use cruise control. Apparently there is no fix. I have two relatives who have experienced the same problem with their Teslas; I will suggest that they report it to NHTSA.
Mfg Campaign: 11559375
Recall Date: Dec 10, 2023
Tesla, Inc.
Defect Description:
VISIBILITY/WIPER
Potential Consequences:
This is currently occurring with FSD Beta firmware V.2022.27.12, but the same has occurred with previous firmware versions as well. Normal operation of the vehicle with the ADAS engaged puts the windshield wipers into AUTO mode. At times the windshield wipers are activated when the windshield is dry: it isn't raining or misting, and there is nothing on the surface of the windshield at the location of the front-facing cameras. These "no rain" occurrences seem to correlate with times when visibility ahead has low contrast and there are no highlights or shadows to sharply define the scene (i.e., it looks a little "muddy"). I have observed this when it is overcast but not raining; around sunrise, when the sun is low in the sky, light levels ahead are relatively low, and the headlights aren't helping illuminate the road; and at times when there are high-contrast light and shadows immediately ahead but mist or fog farther down the road creates the same low-contrast conditions at a distance. It is as if the ADAS is aware that the scenery is not sharply defined and is "wiping its eyes" in an attempt to gain a more sharply defined view. Occasionally, though rarely in my experience, the windshield washer is also engaged. Having the ADAS activate the wipers at these times is not a safety hazard per se, but it is distracting and concerning, as it gives the driver the impression that the ADAS may be malfunctioning. I have read reports in user forums from a fair number of FSD Beta drivers complaining that the wipers come on when the windshield is dry and the ADAS is engaged. I have not observed the same behavior when the ADAS is not engaged.
Mfg Campaign: 11558753
Recall Date: Dec 6, 2023
Tesla, Inc.
Defect Description:
FORWARD COLLISION AVOIDANCE: AUTOMATIC EMERGENCY BRAKING; FORWARD COLLISION AVOIDANCE: WARNINGS
Potential Consequences:
At least several times in the past 3 months, when using Autopilot and passing semi-trucks on my right, my Model 3 has automatically given a very loud warning, which I believe to be either a Forward Collision Warning or a Lane Departure Warning, then braked and seemingly veered to the left. This is scary beyond belief and has only happened when Autopilot is engaged and large semi-trucks are to my right. I thought you should know about this issue. Thank you.
Mfg Campaign: 11557035
Recall Date: Nov 27, 2023
Tesla, Inc.
Defect Description:
SERVICE BRAKES; FORWARD COLLISION AVOIDANCE: ADAPTIVE CRUISE CONTROL
Potential Consequences:
While driving on the highway from Philadelphia to Allentown on I-476 North, I was using cruise control when the car suddenly applied the brakes (phantom braking). There was no one in front of, beside, or behind me. No accident happened.
Mfg Campaign: 11557036
Recall Date: Nov 27, 2023
Tesla, Inc.
Defect Description:
SERVICE BRAKES
Potential Consequences:
Driving on the freeway with the adaptive cruise control set to 78 mph. Traveling in the #1 lane, medium traffic, no vehicles in front of us for a while (in our lane or in the #2 lane next to us). The roadway was dry, and the sun was overhead at that time of day. The weather report said it was 72 degrees outside, so not excessively hot. While nearly cresting the top of a slight incline, the car hard-braked, taking the speed from 78 mph to 58 mph for no obvious reason. (Thank heavens the car behind us was following at a safe distance and was able to brake, too.) There was no debris on the roadway or any other reason why the car would need to brake. Suffice it to say that we deactivated the adaptive cruise control for the remainder of the journey.
Mfg Campaign: 11555919
Recall Date: Nov 19, 2023