Using FSD too much makes it dumb: severe hallucinations such as phantom oncoming traffic and running red lights. After more than 50 accidents, Tesla is under investigation.
What? Can AI lose its intelligence after long-term operation?
It sounds absurd, but it may well be true. The prime suspect right now is Tesla's FSD, the system that single-handedly kicked off the wave of automotive intelligence, serves as a beacon for the intelligent assisted-driving sector, and remains the key player in North American autonomous driving.
US regulators have just launched a new investigation on the grounds that FSD carries a risk of "becoming more stupid the more it's used".
If that is confirmed, more than 2.88 million Teslas will face a recall, which is nearly Tesla's entire cumulative sales in the US.
Is there really not a single compliant version among all the FSD releases Tesla has ever sold?
What's wrong with Tesla this time?
Tesla had just rolled out FSD V14 with great fanfare a couple of days earlier, and Elon Musk was publicly celebrating the release.
The NHTSA (National Highway Traffic Safety Administration) showed no deference and immediately opened an investigation into FSD.
The NHTSA said the investigation focuses on two types of problems. The first is FSD running red lights. Plenty of user-uploaded footage of this circulates online:
The NHTSA confirmed receiving 18 complaints about red-light-running accidents, 4 of which resulted in injuries to one or more people.
Moreover, multiple accidents occurred at the same intersection in Joppa, Maryland, and the NHTSA revealed that Tesla has already taken corrective measures at that intersection.
Was the fix to the map or to the driving rules? Unknown, but it shows that, at least before V14, Tesla's FSD could not be fully data-driven and still needed manual optimization for specific scenarios.
The second type is FSD taking the wrong lane: suddenly driving against traffic into the oncoming lane, going straight from a turn-only lane, or driving onto the sidewalk:
In this second category, 18 complaints were likewise confirmed. All of them stated that FSD drove into the oncoming lane during or after a turn, crossed the double yellow line while going straight, or tried to drive the wrong way into a no-entry road, ignoring the posted wrong-way signs.
According to the official filing, counting reports Tesla volunteered itself and those from media tests... the agency received a total of 58 reports of FSD violating traffic-safety rules while in use, causing a total of 23 injuries.
Among them, testing by the third-party vehicle-testing firm AMCI Testing drew special attention from regulators and the media, because across multiple runs totaling 1,000 miles (about 1,600 km) they found that FSD performed impeccably in the first 5 minutes, without exception, often executing maneuvers that surpassed human drivers.
That looks foolproof and leaves users with the impression that "FSD is very capable", at which point some of them unconsciously relax their supervision, taking their hands off the wheel or even their eyes off the road.
FSD's failures, however, tend to be unpredictable. AMCI specifically stressed that the problems it encountered, such as running red lights and taking the wrong lane, almost always occurred only after FSD had been running for a while.
It starts out like a seasoned driver, then gets dumber the longer it is used.
In response, the NHTSA laid out a detailed review: whether the system alerts users to the actions it is about to take; whether it gives the driver enough time to respond; how well FSD recognizes traffic signals, displays that information to the driver, and responds appropriately; and how well it recognizes lane markings and wrong-way signs and responds to them.
It will also re-evaluate whether each OTA update affects FSD's compliance with traffic laws and signals.
If problems are confirmed, all Tesla models equipped with FSD will be recalled. The official scope listed is 2,882,556 vehicles, almost all of Tesla's historical cumulative sales in North America.
The NHTSA's criterion is every model fitted with the FSD hardware package, not just vehicles whose owners have activated the FSD service.
In fact, in almost every investigation involving Tesla, the agency targets all delivered Tesla vehicles, regardless of batch or model year.
FSD's "Defect History"
Where might the problem be this time?
The most likely culprit is the "hallucination" problem of the end-to-end model. The industry's current mainstream remedy is to strengthen the model's cognition with large language models, for example by using an external VLA/VLM model to interpret the scene and guide the system's trajectory output.
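To make that pattern concrete, here is a minimal Python sketch of one way such a guidance loop could work, assuming a planner that proposes candidate trajectories and a VLM critic that can veto them. All names, the toy scene attribute, and the hard-coded veto logic are illustrative assumptions, not Tesla's actual architecture.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Trajectory:
    waypoints: List[tuple]   # planned (x, y) positions
    planner_score: float     # confidence assigned by the end-to-end planner
    crosses_stop_line: bool  # toy scene attribute used by the demo critic

def propose_trajectories(frames) -> List[Trajectory]:
    # Stand-in for the end-to-end planner: returns ranked candidates.
    return [
        Trajectory([(0, 0), (0, 20)], planner_score=0.9, crosses_stop_line=True),
        Trajectory([(0, 0), (0, 5)], planner_score=0.6, crosses_stop_line=False),
    ]

def vlm_vetoes(frames, traj: Trajectory) -> bool:
    # Stand-in for the VLM critic: vetoes any trajectory that would
    # cross the stop line while the (hard-coded) scene shows a red light.
    light_is_red = True  # a real critic would infer this from the frames
    return light_is_red and traj.crosses_stop_line

def select_trajectory(frames) -> Optional[Trajectory]:
    candidates = propose_trajectories(frames)
    # Keep only trajectories the critic does not veto, then fall back
    # to the planner's own ranking among the survivors.
    safe = [t for t in candidates if not vlm_vetoes(frames, t)]
    if not safe:
        return None  # no acceptable plan: escalate to the driver
    return max(safe, key=lambda t: t.planner_score)

if __name__ == "__main__":
    print(select_trajectory(frames=None))  # picks the non-violating candidate
```

The appeal of this design is that the critic's veto gives an explicit, inspectable reason for rejecting an output, which is exactly what a monolithic end-to-end network lacks.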
The newly released FSD V14 does show this kind of environmental understanding; for example, it can drive itself over unpaved ground in a private yard and finally park in a garage full of clutter:
But for the NHTSA, pinpointing exactly where the fault lies in the earlier accident cases is difficult.
Because the end-to-end model is a black box, directly adjusting its parameters is very hard, and Tesla itself may be unable to accurately attribute failures and trace them to their source. It can only raise FSD's capability at the whole-system level by rebuilding the technical stack (such as the transition in V14 to a one-stage end-to-end system).
So this investigation is likely to have little real impact on Tesla, just like all previous investigations.
There are currently 4 ongoing investigations, including one opened in January this year into accidents involving remotely moved Teslas, one from October last year into a series of FSD accidents in bad weather, and a recent one into whether Tesla's Robotaxi deployment is compliant...
The scope of each investigation covers all Tesla vehicles delivered throughout history.
But a standard NHTSA investigation takes at least 18 months, and even longer in complex cases; for example, there is still no conclusion on the fatal 2023 accident involving a Tesla FSD user.
So the conclusion of this investigation is at least 18 months away, by which time FSD V15 and V16 may already have launched.
This reflects the tension between the traditional regulatory process and the pace at which AI technology iterates. In essence, Elon Musk is using the lag in US regulation as a window in which to accelerate the evolution of AI capabilities.
Old problems may stop being problems, new problems will surface, and those in turn will be worked out in later iterations... What Musk has to do is, on one hand, back the AI team to keep exploring, and on the other, retain an excellent legal team to handle the regulators and minimize the cost of operating in the law's gray areas.
Maybe one day, Musk will be able to solve the ultimate problem of AI in the physical world and bring all of humanity into a new era.
However, there is still a price to pay, and the question is who will bear it.
This article is from the WeChat official account "Intelligent Vehicle Reference" (ID: AI4Auto). Author: Jia Haonan. It is published by 36Kr with authorization.