Car technology has come a long way, with some carmakers, such as Tesla, boasting that their vehicles have self-driving capabilities. However, the rise of driverless technology has raised questions of liability in the event of an autopilot crash. Some experts believe vehicle manufacturers should be responsible for accidents that result from the malfunction of a feature they claim should work.
The question of liability gained significant traction after a Tesla Model S driver was charged with manslaughter for an accident that occurred while the vehicle was in Autopilot mode. Kevin George Aziz Riad was charged with two counts of vehicular manslaughter with gross negligence following the deaths of Gilberto and Maria Lopez.
Autopilot Crash: New Standards for Liability
Gilberto’s family filed a lawsuit in Los Angeles in 2020, claiming that Riad’s vehicle was traveling at an excessively high speed while on Autopilot when the accident occurred. The car’s Autopilot feature is designed to steer, accelerate, and brake on its own. The NHTSA confirmed that Autopilot was engaged when the accident happened, indicating that the feature may have failed. Prosecutors have not released further details about the case.
The charges against Riad are the first for an accident involving a vehicle with semi-autonomous driving technology. Beyond being a first, the case highlights the dangers of over-reliance on and misuse of technology, and it sets a standard for liability for drivers of semi-autonomous cars.
“Each driver has a duty of care and responsibility for everything that happens on the road. Riad’s negligence case stands on a situational concept of negligence,” says car accident attorney Rustin Smith of Smith Hulsey Law. According to Smith, the court will look at what was reasonable given the circumstances. Michael Brooks, the Center for Auto Safety’s chief operating officer, hopes drivers of semi-autonomous vehicles will look at Riad’s case and realize that Autopilot has limitations. According to Brooks, the semi-autonomous feature is not meant to take drivers from point A to point B; drivers remain responsible for everything that pertains to a vehicle’s operation.
Semi-Autonomous, Not Self-Driving
Others, such as Jonathan Handel of the University of Southern California Gould School of Law, believe that while autonomous features should not be viewed as a replacement for the driver’s responsibility to control the vehicle, Tesla should also be held liable for the deaths. Tesla has not responded to requests to address the issue but argues that its Model S meets all statutory safety standards. On its website, the carmaker says the feature is meant to assist the driver and requires active driver supervision.
According to Brooks, Tesla’s use of the word “autopilot” could lead people to have unrealistically high expectations of the feature, resulting in catastrophe when it fails. Brooks reiterates that it is only an advanced driver-assist feature, not a self-driving feature.
It Has Happened Before
This is not the first time a Tesla on Autopilot has caused an accident. In 2019, a Florida-based finance executive was driving his Tesla on Autopilot when he reached down to pick up his phone from the car’s floor. The driver presumably believed the Autopilot feature would stop the car at a stop sign or red light, but it did not. The car went straight through, crashed into another vehicle, and killed that vehicle’s occupant, a 22-year-old college student.
The misconception that semi-autonomous means self-driving has led many drivers to engage in risky behavior while testing their cars’ capabilities. Unfortunately, a malfunction in the system can be fatal.