For our most recent Enjuris Scholarship Essay Contest, we posed a controversial and hotly debated question to college students about how they think liability should be handled in the case of driverless car accidents. We thought this topic was particularly relevant in light of the fatal Uber self-driving car crash that occurred roughly one year ago on March 18, 2018 in Tempe, Arizona.
While it's doubtful that tragic stories such as this (or others) will stop the constant push to increase adoption of autonomous vehicles, here at Enjuris, we do believe it's worth pausing to reflect and discuss what lessons can be learned as well as explore how important legal issues might be addressed in the future.
Following the end of our scholarship entry period, the Enjuris team was faced with the daunting task of choosing just one winner of our $1,000 scholarship prize. However, we were thrilled to receive so many other thoughtful and well-written submissions that we wanted to share some of our favorites with you.
Continue scrolling below to read excerpts from some of the top submissions we received and learn where today's college students (and future lawmakers and game changers) stand when it comes to driverless car liability.
Essay question: With the rise of driverless car technology, lawmakers will have to tackle certain legal challenges. How do you think liability should be addressed in self-driving car accidents? Who's to blame when a self-driving car causes a crash?
Amberly Petersen, University of Phoenix (Scholarship Winner):
"Despite the leaps and bounds technology has made and despite the countless hours of safety testing, tragedy can still strike. When it does, there must be laws that allow us to apply the same standards to both humans and machines.
If an accident does occur in a vehicle that has been properly maintained and an override has not been made by the driver, the company that has produced such vehicle should be held liable for the damages that have occurred."
Krista Garcia, University of Central Florida:
"If the self-driving car was in an accident where another person was at fault, such as running a red light, running a stop sign, or speeding, then it is not the self-driving car or the driver of the self-driving car who is to blame. This is why evaluating all facts of the case and treating it as any normal car crash is important, because the answer and path to justice will not always fall to an autonomous car… When self-driving cars are involved in accidents or crashes, these situations should be treated on a case-by-case basis, based on the way we evaluate car crashes in the eyes of the law today."
Emma Giovanoni, Thomas Jefferson University:
"In order to help determine who is at fault in any incident involving driverless cars, all automated cars should have a black box that records the surroundings and the car's reactions, as they have in airplanes, so the investigation team can tell what sensors might have been deactivated or whether features had been overridden, as well as human reactions such as whether the brakes were applied or the steering wheel was turned in an attempt to avoid the accident."
"Even in self-driving cars, drivers are supposed to remain alert and be able to take over control of the vehicle at a moment's notice. When the drivers are not able to immediately react to changing conditions, they should be held responsible. The technology, though advanced, is not yet ready to take over for humans entirely."
Brandon Girdley, University of Louisville:
"...the companies that design, build, and put self-driving cars on the road should be liable when such a car causes an accident. The tragedy of a death at the hands of a driverless car is the result of decisions made by those who put that car on the road. Ethically speaking, a self-driving car is nothing more than an extension of those who design it. Therefore, its decision-making process mirrors our own. While the reaction time and decision-making of an autonomous computer may be much faster than a human's, humans are still the ones responsible for designing such technology. And, by extension, humans are responsible for how that technology behaves in real-world scenarios."
Ayllip Khieu:
"...whoever caused the accident is at fault. So, if the other person was at fault, great. You and the autonomous car are innocent. However, if you and the autonomous car were at fault, it could go either way."
Rita Mormando, Loyola University:
"To blame the actions of the car on its automatic features is not a reliable excuse. For example, pilots have clearance to fly on autopilot during most of the flight, but the person who is sitting in the pilot's seat is still responsible for the plane taking off, while it's in the air, and landing safely at its destination regardless of its impressive features. If a plane should crash or not land at its designated stop, the pilot would face serious consequences for the flight. That being said, I believe that this idea should be applied to people operating self-driving vehicles… I think that self-driving cars are an incredible invention and this idea of progress in this area of technology is outstanding, but it does come with some tough decisions. This feature is not an excuse for people to be lazy and let it do all the work for them. One must still pay attention and be alert while behind the wheel, and in the unfortunate event of a car accident, the blame should be put on the person whose car caused the crash."
Alec Parent, University of Utah:
"In the scenario of a crash in a PURELY self-driving car, it would make sense that the company responsible for manufacturing the software that navigates the car is also responsible for the damage unless the passenger waived their right to sue in some complicated and obscure Terms of Service agreement. The real issues for me come with the current assisted driving technology in cars from companies like Ford and Tesla. If the ‘driver' engages an autopilot feature and has the ability to control the car at any moment, they should be responsible for their safety on the road just as if they were driving a normal car. But what if the car's autopilot system malfunctions?"
"Say the system crashes and gets stuck telling the car to continuously accelerate without bounds. Now there's a two-ton wrecking ball speeding down the highway just waiting until it hits and likely kills an unsuspecting bystander."
Victoria Parmentier, Valencia College:
"The responsibility of liability should fall solely on the owner of the self-driving vehicle whose car caused the accident, whether it be driver error or a malfunction in the car itself… You cannot hold the manufacturers or car dealers liable because in today's society, with regular cars, we don't hold them liable. On any given day we see tires blow out, brakes seize up, engines blow, and more that cause accidents. With these types of accidents we do not go after the car dealer who sold the car or the car manufacturer or parts manufacturer. The responsibility still falls on the driver, their car, and their insurance. Why should self-driving cars be any different?"
Tashonda Rhule, Valencia College:
"At first, some people may be quick to assume that there are some sort of automatic loopholes associated with driverless cars and liability. In reality though, a driverless car is only just that, driverless. It doesn't mean there is nobody in the vehicle. Any person ‘behind the wheel' of a driverless car should be held to the same standards and laws as traditional vehicle operators (i.e. appropriate age, valid license, etc.)... if fault is placed on the driverless car, the passenger, or human taking responsibility for the vehicle, should be at fault. Additionally, any person that owns or operates a driverless car should be mandated to sign an agreement that essentially makes them liable for the driverless vehicle they choose to use."
"Think of how car insurance works: As soon as you purchase a vehicle, you must also buy car insurance. Part of this process involves assigning a ‘driver,' and also agreeing to certain circumstances involving other passengers or drivers of the same vehicle. Similar clauses should also pertain to driverless cars."
Cecilia Roessle, University of Arizona:
"In the case of an accident, the liability seems to shift from the driver to the car. Humans will be benefiting indefinitely from this new technology; they won't have to go through the work of driving nor be held liable in the event of an accident… Sooner or later, human drivers will be banned from all streets and will be replaced by these avant-garde inventions."
Jamay Sidberry, Appalachian State University:
"The thing about driverless cars is that they are not completely driverless or without some sort of human guidance. The person in the driver's seat of the car has input on certain decisions the car makes and can make commands. If a driver makes a command that someone handling the accident can identify as the cause of an accident, then I do think that it is the person's fault that [an] incident occurred. If the accident was a result of a movement that the car made with no direction, then it should be the fault of the maker of the car. In order to enforce this, I think that lawmakers should make it a requirement for driverless cars to be able to save all of their actions in their databases, and whether or not each was a driver- or car-influenced action."
Kyle Smith, University of Arizona:
"When you are on the road and behind the wheel of an automobile, you are responsible for your life and for the lives of all the friends and family members inside of that vehicle. Every left- or right-hand turn made on a busy street could have serious repercussions if not done correctly. Now imagine transferring that very responsibility solely to a computer system filled with millions of ones and zeros telling it where to drive. Quite terrifying, isn't it?"
"In the event of a self-driving car causing an accident, liability must first fall into the owner's hands. The first question that must be addressed is whether the owner is to blame. If the vehicle was not well maintained and does not meet the car manufacturer's guidelines, then the owner must be held accountable for the accident. If the vehicle's autonomous systems are well maintained, the liability should then fall into the hands of the manufacturer. The manufacturer must assess what caused their vehicle to fail to avoid an accident. If a manufacturer is not confident that their autonomous vehicles will be fully effective at stopping potential car accidents, then they should not be on the road."
Riley Taylor, University of Kansas:
"The minute you decide to buy a car, it is your responsibility and liability. This should be applied even when you buy a self-driving car. Although it can be hard to determine who is to blame when a self-driving car causes a crash, the owner of the car should ultimately be liable… Just like with any accident, each self-driving car crash will be different and therefore, it is hard to determine who is to blame in a general sense. Each accident should be investigated to find out if the self-driving car was at fault, or a third party… By turning on the self-driving feature, the person in the car makes a decision that allowed the car to take control. This decision could make an accident at the hands of a self-driving car ultimately their fault."
Ashley Williams, Pepperdine University:
"Whether they were in control of their car or not at the time of the accident does not matter; it is their responsibility at all times. There should be no confusion here: when a self-driving car crashes and is at fault for creating the crash, the program failed, but the person in the car, who is still responsible for remaining attentive, also failed. Granted, there will be challenges for lawmakers in placing responsibility on the car owner rather than the program creator or car manufacturer. However, if there comes a time when these cars are seen as being as beneficial as other safety features, then there will be no room for debate."
"[One] case to look at is the accident of Joshua Brown driving on autopilot while in his Tesla Model S. Joshua Brown was allegedly watching a movie while his car drove him from point A to point B, but he did not quite make it. A tragedy? Yes. The fault of the car, or anyone else for that matter? No."
Molly Wolf, University of South Carolina:
"...each scenario will need to be looked at with extreme care and concern for the people involved, without immediate blame being placed on either party. If the accident involves one driverless vehicle and one standard car, the data from the self-driving car's technologies would need to be examined. Since driverless cars are programmed not to make mistakes, the most probable candidate for blame is the human driving the traditional car; however, one cannot be sure without consulting the technologies in question. The self-driving car's sensor technology will be the most helpful in determining liability because it will be able to show what the other car did when it entered its radar, thus telling the story of the accident."
Jill Wywialowski:
"I would like there to be more ways to prevent accidents, but a self-driving car makes me think that there would be an increase in accidents from a software [malfunction] or something else that I could not control. How do we blame people for things that are out of their control? But by buying a self-driving car, you are taking that risk. Same as if you were to drive drunk, you are not completely in control and are risking your life and others' lives by your choice."
We'd like to thank all the students who participated! Be sure to visit our scholarship page soon for news about our next essay contest.