The National Highway Traffic Safety Administration (NHTSA) has intensified its investigation into Tesla's Full Self-Driving (Supervised) system, upgrading the probe to its highest level of scrutiny after identifying nine crashes linked to failures in reduced-visibility conditions. This escalation, announced in March 2026, marks a critical moment for Tesla's autonomous driving ambitions and raises serious questions about whether the technology is ready for real-world deployment.

What Exactly Is Going Wrong With Tesla's Full Self-Driving in Bad Weather?

The core issue is straightforward but alarming: Tesla's FSD system struggles to recognize when its cameras can't see properly. When fog rolls in, sunlight glares off the road, or dust clouds obscure the view, the system doesn't always alert drivers that its vision has degraded. In crashes reviewed by NHTSA, "the FSD system did not detect common roadway conditions that impaired its visibility and/or provide alerts when camera performance had deteriorated until immediately before the crash occurred".

The investigation, which began in October 2024, originally examined four reported crashes. That number has now grown to nine incidents, one of which involved a pedestrian fatality. NHTSA also noted that Tesla's own "data and labeling limitations" may mean the actual number of similar crashes is even higher than reported.

How Many Tesla Vehicles Are Affected by This Safety Issue?

The scope of this investigation is massive. NHTSA's probe covers 3.2 million Tesla vehicles, including the Model S, Model X, Model 3, Model Y, and Cybertruck, all of which can use the FSD-branded driver assistance system. That's not a small subset of Tesla's fleet; it represents a significant portion of the company's vehicles on the road today.

What makes this particularly concerning is that Tesla began developing a fix for these low-visibility problems back in June 2024, before NHTSA even opened the formal investigation.
Yet as of the agency's latest filing, Tesla hasn't clearly communicated to regulators whether that fix was actually deployed or which vehicles received it.

Why This Investigation Matters More Than Previous Probes

NHTSA has upgraded this investigation from a preliminary evaluation to what's called an "engineering analysis," the agency's highest level of scrutiny. The distinction is crucial because an engineering analysis often precedes a formal recall order. The move signals that regulators have found enough evidence of a potential safety defect to warrant deeper technical analysis.

This isn't Tesla's only FSD-related investigation. NHTSA is simultaneously probing more than 80 instances in which Tesla's driver-assistance software violated basic traffic safety laws, such as running red lights. Together, these investigations paint a picture of a system that may not be as reliable as Tesla's marketing suggests.

Steps Regulators Are Taking to Understand the Problem

- Crash Analysis: NHTSA has reviewed multiple incidents where FSD was active within 30 seconds of a collision, examining whether the system detected hazards and alerted drivers appropriately.
- Data Requests: The agency is demanding detailed information from Tesla about the fix developed in June 2024, including deployment timelines and which vehicle models received updates.
- Pattern Recognition: Investigators are looking for common threads in the crashes, particularly instances where FSD lost track of, or never detected, lead vehicles in its path during reduced-visibility scenarios.
- Transparency Verification: NHTSA is pressing Tesla to clarify whether the company's reported data limitations have led to underreporting of similar incidents.

In several reviewed crashes, a troubling pattern emerged: "the system either did not detect a degraded state, and/or it did not present the driver with an alert with adequate time for the driver to react".
This suggests the problem isn't just detection; it's giving drivers enough warning to take over when the system's capabilities are compromised.

What Does This Mean for Tesla's Robotaxi Dreams?

The timing of the investigation's intensification is particularly awkward for Tesla. CEO Elon Musk has staked the company's long-term valuation on autonomous driving technology, famously stating that Tesla's ability to develop self-driving capabilities will determine whether the company is worth "lots of money, or basically zero". Meanwhile, Tesla has been working to launch a robotaxi service in Austin, Texas, which would rely heavily on FSD technology operating without human oversight.

If NHTSA determines that FSD has a safety defect, the agency could mandate a recall or require software updates across millions of vehicles. Such an outcome would undermine confidence in Tesla's autonomous driving capabilities precisely when the company is trying to convince regulators and the public that the technology is safe enough for driverless operation.

Tesla did not immediately respond to requests for comment about the investigation. The company's silence, combined with the regulator's escalating scrutiny, leaves many questions unanswered about the timeline for fixes and the company's transparency with safety authorities.

The investigation highlights a fundamental challenge in autonomous driving development: systems that work well in ideal conditions may fail catastrophically as real-world complexity increases. For Tesla and the broader autonomous vehicle industry, this probe is a reminder that marketing a system as "Full Self-Driving" carries enormous responsibility, and regulators are watching closely to ensure the promise matches reality.