Tesla enters critical trial over fatal Autopilot crash

Tesla heads into a critical trial today over a fatal crash that reportedly involved Autopilot. The result of the trial could influence the legal outcomes of other similar incidents.


The trial, which begins today in California’s Riverside County Superior Court, concerns a civil lawsuit brought by Micah Lee’s estate against Tesla.


It revolves around an accident in which Micah Lee’s Tesla Model 3 veered off the highway near Los Angeles, crashed into a palm tree, and caught fire.


Lee died and his two passengers were seriously injured.


The passengers and Lee’s estate claim that a defect in Autopilot caused the accident and that Tesla knew there was a problem with its system, making the automaker responsible.


Tesla, on the other hand, argues that driver attention is the issue and has also noted that Lee had alcohol in his blood, even though it was under the legal limit.


The automaker also disputes that Autopilot was engaged during the crash. However, Tesla has previously made that claim in cases where Autopilot disengaged itself only moments before a crash.


Tesla has won similar cases in the past, but this one is higher profile since it unfortunately involves a fatality.


It could also set a precedent for Tesla and other Autopilot accident cases, as the argument that Tesla warns drivers that responsibility remains with them will be tested again.


Opening statements are expected in court today, and the trial could last over a week.


Electrek’s Take

Personally, I have yet to see clear evidence that Autopilot or FSD is responsible for a major Tesla accident, but that’s only if you accept the premise that Tesla clearly warns drivers to keep their hands on the steering wheel and to be ready to take control at all times.


In virtually all the cases I’ve seen, drivers had time to correct any Autopilot or FSD Beta behaviors.


However, lately, I’ve seen a few things from FSD Beta that are worrying. The relatively new behavior of FSD Beta moving to the left of a lane when passing a truck has resulted in some small incidents and near-misses that would be hard to blame on the drivers.


Also, earlier this month, I reported on a serious problem with the latest FSD Beta update, which tried to drive me off the road twice within a few minutes. I was using the system as intended, with my hands on the wheel and eyes on the road, but even then I was barely able to take control in time.


Therefore, I think there’s room for Tesla’s liability for its system to be tested in court, but I don’t know enough about this particular case to say whether it’s warranted here. We will have to follow the court case and see.
