Tesla Ordered to Pay 1.7 Billion Yuan in US Autopilot Fatal Crash Case, with Both Automaker and Owner Held Responsible

智能车参考 (Intelligent Vehicle Reference) · 2025-08-04 16:32
Tesla appeals: The car owner should bear full responsibility.

A court case that has shocked the United States and the world.

Due to a fatal accident involving Tesla's Autopilot (the predecessor of FSD), Tesla is facing a damages award of roughly 1.75 billion yuan (about $242.5 million).

In 2019, a Tesla owner driving a Model S got distracted while using Autopilot; the car crashed at high speed into a vehicle at the roadside, killing one person and injuring another. The deceased was only 22 years old.

After the incident, the family members of the deceased and the survivor sued Tesla and the owner. The jury ruled that the driver and Tesla should jointly bear the responsibility for the accident.

However, Tesla does not accept this outcome. It sticks to its long-standing position: the owner misused the system, so Tesla, as the provider of Autopilot, should not be held liable.

Tesla has made clear that it will appeal.

In fact, this verdict has caused a sensation not only because of the roughly 1.75-billion-yuan award, but more importantly because, once it becomes a precedent, it could be the straw that breaks the camel's back for Tesla.

Whether with Autopilot then or FSD now, Tesla cannot afford to bear full responsibility for such accidents.

Tesla Ordered to Pay 1.75 Billion Yuan over a Fatal Accident

After nearly a month of trial, a jury in Florida finally ruled that Tesla bears one-third of the responsibility for the fatal crash of April 2019, with the at-fault driver bearing the remaining two-thirds, and that the two parties must together pay the family of the deceased and the survivor $329 million.

The compensation is divided into two parts: $129 million in compensatory damages and $200 million in punitive damages. Under the apportionment of responsibility alone, Tesla's total payout would have been just over $100 million. However, because the at-fault driver reached a settlement with the plaintiffs and is no longer a defendant, punitive damages cannot be imposed on him, so the entire $200 million in punitive damages falls on Tesla.

Adding one-third of the compensatory damages, Tesla's final bill comes to $242.5 million, roughly 1.75 billion yuan. That is a very large sum: enough to buy about 2,425 Model S cars in North America, and equal to roughly 20% of Tesla's net profit in the second quarter of this year.
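As a rough check of these figures, assuming an exchange rate of about 7.2 yuan per US dollar (the article does not state the rate it used):

\[
\$242.5\text{M} - \underbrace{\$200\text{M}}_{\text{punitive}} = \$42.5\text{M} \approx \tfrac{1}{3}\times \underbrace{\$129\text{M}}_{\text{compensatory}},
\qquad
\$242.5\text{M} \times 7.2 \approx 1.75\ \text{billion yuan}.
\]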

With this verdict, the six-year-long case has finally made substantive progress. The announcement caused a sensation; some media even described it as a "rare victory", since this is the first time a lawsuit over a Tesla assisted-driving accident has gone to trial.

However, nothing is settled yet. Given the size of the award and the far-reaching impact of the case, it will not be resolved easily. Tesla has appealed the verdict, arguing that "the verdict is wrong" and that there were legal errors and irregularities in the trial, and stating that no vehicle available in 2019, or since, could have prevented this accident.

What kind of accident could make Tesla reach such a conclusion?

The Fatal Accident Six Years Ago

On April 25, 2019, a Model S owner was approaching an intersection when his phone fell to the floor. He bent down to pick it up. At the time, Autopilot was engaged and the car was in assisted-driving mode. The Model S ran a stop sign and a red light at about 100 km/h, crossed the intersection, and crashed into a Chevrolet SUV on the far side and the couple standing beside it.

The woman was thrown 23 meters by the impact and died at the scene; she was only 22 at the time. The man narrowly escaped death but suffered multiple fractures, head injuries, and psychological trauma.

The above is the version of the accident reported by the media. After the incident, the family members of the deceased and the survivor sued both Tesla and the owner. However, they later reached a settlement with the owner, and Tesla became the sole defendant.

Tesla therefore had to respond, and its response brought more details of the accident to light:

The Model S owner was speeding, with his foot on the accelerator, which overrides Autopilot's control of the vehicle's speed. Moreover, he was searching for his phone inside the car rather than watching the road.

It is unknown to the outside world how Tesla obtained the vehicle data at the time of the accident and whether it had the owner's permission to do so.

In short, in Tesla's view, this accident had nothing to do with Autopilot, and the owner should bear the responsibility. The local traffic police also pointed out that the owner was driving recklessly.

However, the plaintiffs and their lawyers argued that Tesla designed Autopilot for highways yet deliberately chose not to restrict its use on other roads, and that Elon Musk's claims that Autopilot drives better than humans, together with Tesla's own descriptions of the feature, led drivers like the Model S owner to over-trust the system, ultimately leading to the crash.

Image: description of the Autopilot feature (source: Tesla official website)

After days of confrontation at trial, the jury ultimately found that both the at-fault driver and Tesla were responsible.

Both Parties Held Responsible

Even so, if this verdict becomes a precedent, it would be close to a disaster for Tesla.

Because this case is not an isolated one. Tesla has handled multiple assisted-driving cases in recent years, and they usually ended in settlements. This is the first time such a case has gone all the way to trial and produced a verdict against Tesla.

Although Tesla still has the opportunity to appeal, this verdict will obviously serve as a reference.

Of course, this is not a challenge for Tesla alone but a collective challenge for the entire "intelligent assisted driving" industry: the whole industry rises and falls together.

Since an accident involving a certain car company in April this year, the public in China has paid much closer attention to assisted-driving safety.

In terms of publicity, the promotion and launch of such features are now strictly regulated. For example, terms such as "high-level intelligent driving" and "intelligent driving" are gradually disappearing, replaced by "assisted driving".

In terms of software updates, OTA upgrades are now subject to more detailed market-access and recall requirements.

In terms of user education, the effort starts at driving schools: regulators are considering adding assisted-driving operating rules to driver training and even to licensing exams.

With the rapid popularization of assisted driving, car companies have a responsibility, and users have a right, to understand the boundaries of the system's capabilities, because this concerns not only the safety of those using the system but also the safe travel of everyone on the road.

"Improper use by the owner" is a simple statement, but the consequences of improper use are too heavy.

Reference Links:

https://www.reuters.com/legal/litigation/tesla-ordered-by-florida-jury-pay-243-million-fatal-autopilot-crash-2025-08-01/

https://www.cnbc.com/2025/08/01/tesla-must-pay-329-million-in-damages-in-fatal-autopilot-case.html

https://electrek.co/2025/08/01/victims-of-tesla-autopilot-crash-are-seeking-345-million-in-damages/

https://www.tesla.com/support/autopilot

This article is from the WeChat official account "Intelligent Vehicle Reference". Author: Yifan. Republished by 36Kr with permission.