Self-driving cars have been depicted in movies for years. But there’s one thing futuristic films like Minority Report don’t address:
Who’s liable when a self-driving car crashes?
The fatal car accident involving Jovani Maldonado may soon shed some light on that question.
Let’s take a look at the accident and the liability issues surrounding self-driving cars.
The accident
Benjamin Maldonado and his 15-year-old son, Jovani, were returning home from a soccer tournament in California on August 24, 2019, when a tractor-trailer slowed in front of them.
Benjamin put on his turn signal and started to change lanes. As he did, he noticed a Tesla Model 3 approaching fast from behind. He tried to swerve back into his lane, but the Tesla, traveling more than 65 miles per hour on Autopilot, rear-ended his pickup truck before he could.
Jovani, who was not wearing a seatbelt, was ejected from the pickup truck. Tragically, he died at the scene of the accident.
“We are living day by day,” Benjamin told the New York Times. “There is so much sadness inside. We take family walks and try to do things together like going to church. There is a massive hole in the family.”
Tesla’s Autopilot system is supposed to detect obstacles and slow down or stop automatically when appropriate, but data retrieved from the Tesla involved in the August 24th accident shows that neither the Autopilot system nor the driver slowed the vehicle until a fraction of a second before impact.
In fact, the data shows that the Tesla’s speed briefly increased from 69 to 70 miles per hour, and the car didn’t begin to slow until the final fraction of a second before the crash.
The lawsuit
Shortly after the accident, Benjamin Maldonado filed a wrongful death lawsuit on behalf of his son in Alameda County Superior Court.
Benjamin’s lawsuit alleges that Tesla’s Autopilot was defective because it failed to react to traffic conditions. The lawsuit further alleges that the driver of the Tesla, Romeo Lagman, was negligent for failing to pay attention behind the wheel and slow down in time to avoid the accident.
Who’s liable when a self-driving car crashes?
The question of who should be held liable—the manufacturer or the driver—when a self-driving car causes an accident has been the topic of much debate (so much so that it was also the topic of our 2019 college scholarship essay).
On the one hand, manufacturers have a legal duty of care to prevent injuries or damage arising from the failure of their products. If a product suffers from a design defect, manufacturing defect, or marketing defect, the manufacturer may be held liable for any injuries that result. This type of lawsuit is called a product liability lawsuit.
In such cases, manufacturers may raise the defense of comparative negligence (for example, the manufacturer may argue that the driver interfered with the automated processes and caused the accident) or product misuse (for example, the manufacturer may argue that the driver disregarded directions or altered the vehicle).
What’s more, as vehicles become more reliant on technology, the potential for hacking a car’s systems increases. Accordingly, manufacturers may argue that third-party hackers caused the crash.
On the other hand, all drivers have a duty to exercise reasonable care to avoid injuring others on the road. A driver who fails to exercise reasonable care may be held liable for any accident that results. This type of lawsuit is called a personal injury lawsuit based on negligence.
To complicate things even more, some states have statutes that place certain requirements on self-driving car manufacturers and drivers. These statutes may impact liability.
For example, California Vehicle Code Section 38750 requires that:
“The driver shall be seated in the driver’s seat, monitoring the safe operation of the autonomous vehicle, and capable of taking over immediate manual control of the autonomous vehicle in the event of an autonomous technology failure or other emergency.”
And:
“The autonomous vehicle has a system to safely alert the operator if an autonomous technology failure is detected while the autonomous technology is engaged.”
Practically speaking, if you’re injured in a self-driving car accident, your lawyer is likely to sue both the manufacturer and the driver of the vehicle.
Other lawsuits involving self-driving cars
At least 10 people have been killed in accidents involving self-driving cars since 2016, according to reports from the National Highway Traffic Safety Administration. A number of these accidents have resulted in lawsuits. For example:
- Jeremy Banner, a 50-year-old Florida resident, was driving a Tesla Model 3 with the Autopilot system activated when his car struck an 18-wheeler. Banner was killed, and a subsequent investigation found that neither he nor the Autopilot system made any attempt to avoid the collision. His family filed a wrongful death lawsuit that is currently pending.
- Elaine Herzberg was walking a bicycle across the road at night when she was struck and killed by an Uber self-driving Volvo SUV. A subsequent investigation found that the vehicle’s backup safety driver was streaming an episode of the television show The Voice at the time of the accident. Uber settled the case for an undisclosed amount.
Although still in their infancy, self-driving cars will become increasingly common in the coming years. Lawsuits like the one filed on behalf of Jovani Maldonado will go a long way toward clarifying who is likely to be held liable when a self-driving car crashes.
Emerging Technology and Personal Injury: A Brave New World
Self-driving cars, AI, and wearable tech are everywhere. But neither technology nor humans are infallible, and sometimes injuries happen.