In the “olden” days, you could call for an Uber and a car (with a driver) would pick you up and transport you to your destination. By “olden days,” we mean back when Uber launched in California in 2009.
Today, in certain areas of the country, Uber offers self-driving car technology through the company’s Advanced Technologies Group (ATG). The way Uber describes it, a car’s self-driving system features overlapping sensors that gather data across the entire 360-degree area around the vehicle. Combined with maps and other data, this creates a “complete representation” of the space around the vehicle, and the software predicts what is about to happen on the road in order to navigate an area safely.
Wait... wasn’t a benefit of Uber that it was a good side hustle for people wanting a little extra cash?
Well, yes.
But from a corporate perspective, paying drivers is the company’s biggest expense. Especially for services like Uber Eats, using self-driving cars cuts out a lot of those costs. That’s why, beginning in early 2015, Uber self-driving cars made their appearance.
Though Uber touts the safety of self-driving cars, at least one fatality involving these vehicles has already occurred.
Arizona self-driving car fatality
In March 2018, Elaine Herzberg was walking a bicycle across the road at night. She was struck and killed by an Uber self-driving Volvo SUV. The vehicle was in computer control mode at the time of the crash. The driver, Rafaela Vasquez, has been charged with negligent homicide.
Vasquez was reportedly watching a TV show on her phone at the time of the accident.
The crash investigation found that for nearly ⅓ of the trip, Vasquez was looking at her phone, which she’d placed toward the bottom of the car’s center console. The report says that if she’d been attentive, she “would likely have had sufficient time to detect and react to the crossing pedestrian to avoid the crash or mitigate the impact.”
What is automation complacency?
Automation complacency occurs when a manual task competes with an automated task for the operator’s attention. In this instance, it was a visual distraction that competed with the task of driving.
In other words, sometimes an over-reliance on technology or software causes a person to disengage from a specific task — otherwise known as cognitive detachment.
In addition to visual evidence that Vasquez was looking at her phone, records from Hulu indicated that she was streaming The Voice at the time of the crash.
The federal investigation by the National Transportation Safety Board (NTSB) into the crash that killed Elaine Herzberg determined that Uber “did not adequately recognize the risk of automation complacency and develop countermeasures to control [driver disengagement].”
Vasquez’s job with Uber was as a “safety backup driver.” Unlike a “regular” Uber driver, who would actively drive a rider to their destination, a backup driver is present to do exactly what Vasquez didn’t — to ensure that the car is traveling safely, both for riders and for other road users.
Who’s liable in a self-driving car accident?
In the Herzberg case, both criminal and civil liability are involved. The defendant, Vasquez, has been charged with negligent homicide. If she’s found guilty, she faces a criminal sentence that could include jail time.
It has already been determined that Uber is not criminally liable in this case, but the company reached a civil settlement with Herzberg’s family.
Uber’s liability for Herzberg’s death was related to the finding that the vehicle’s sensor system never correctly classified Herzberg as a pedestrian and didn’t correctly predict her path. In fact, in the 6 seconds prior to impact, the system classified her first as an unknown object, then as a vehicle, then as a bicycle. The system did identify that an emergency braking maneuver was necessary, but it had been configured not to activate emergency braking while under computer control.
That means the human operator, Vasquez, was responsible for taking control of that situation. Vasquez did not brake until after colliding with Herzberg.
It should be noted that Arizona follows a pure comparative negligence rule. An injured plaintiff may recover damages even if they are up to 99% at fault for the accident, but their damages are reduced according to their percentage of fault.
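To see how that reduction works in arithmetic terms, here’s a minimal sketch using invented numbers. The function name, the damages figure, and the fault percentage are all hypothetical and are not figures from the Herzberg settlement, which was undisclosed.

```python
# Hypothetical illustration of a pure comparative negligence reduction.
# The dollar amount and fault percentage below are invented for the example.

def reduced_damages(total_damages: float, plaintiff_fault_pct: float) -> float:
    """Return the recoverable damages after reducing for the plaintiff's share of fault."""
    if not 0 <= plaintiff_fault_pct <= 100:
        raise ValueError("Fault percentage must be between 0 and 100")
    return total_damages * (1 - plaintiff_fault_pct / 100)

# Example: $1,000,000 in total damages, plaintiff found 30% at fault
print(reduced_damages(1_000_000, 30))  # 700000.0 -> the plaintiff recovers $700,000
```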
In this case, Ms. Herzberg walked her bike across a dark road, which certainly contributed to her injuries. Presumably, her actions could have reduced the amount her family was able to recover in the undisclosed settlement with Uber after her death.
Doctrine of respondeat superior
In Latin, respondeat superior means “let the master answer.”
What it means in the law is that a corporation can be liable for the actions of its employees. The theory is that an employer receives benefits from the work an employee provides and therefore may control how an employee acts on behalf of the business. Sometimes, it also means that a deep-pocketed corporation is able to pay for damages in a civil lawsuit when an employee, personally, couldn’t.
However, a corporation can’t take on the criminal liability of an employee…it can’t go to jail on an employee’s behalf.
This is important in the Herzberg case because Uber can be sued in a civil lawsuit for Herzberg’s wrongful death, while Vasquez can be charged criminally for her distracted driving.
During the accident investigation, the NTSB also determined that the Arizona Department of Transportation did not sufficiently oversee automated vehicle testing. The NTSB concluded that this was also a cause of the crash.
So, if you’ve lost track of potential civil defendants, here we are:
- Uber
- Arizona Department of Transportation
- Rafaela Vasquez (who has been criminally charged, but that doesn’t prevent her from being sued in a civil lawsuit)
Moving forward
Elaine Herzberg was the first pedestrian killed in a crash involving a self-driving car, but she likely won’t be the last.
Whether you’re employed as the driver of a self-driving vehicle, riding as a passenger, or involved in an accident with one, it’s important to know the legal obligations that come with each of those roles.
If you’re a road user (motorist, pedestrian, or bicyclist) who’s involved in an accident with this type of vehicle, you should know that it’s going to be a complicated claim because of all the liability issues. It will be especially important that witnesses or law enforcement are able to provide accurate accounts of how the accident happened.
If you become a driver for Uber or any other company that uses self-driving vehicles, know your responsibilities ahead of time. A self-driving car doesn’t absolve you of liability in a crash — you need to be aware of your surroundings and remain alert just as you would if you were driving a regular car.
These are going to be complicated cases, in part because there are so many parties and factors involved. The other reason is that these are largely uncharted waters. Anytime cases arise that involve new technologies and new laws that allow them, there are going to be legal nuances that weren’t anticipated.
If you’re involved in this (or any) type of car accident, it’s crucial to call a lawyer who can represent your interests. The Enjuris law firm directory lists attorneys in your area who specialize in various types of personal injury cases.
USDOT's new regulation: AEB systems required in cars by 2029
A new federal mandate requiring Automatic Emergency Braking (AEB) systems in cars by 2029 aims to reduce accidents and enhance driver protection through advanced technology.