The first fatal accident linked to Tesla’s “Autopilot” automated driving technology raises questions about manufacturer liability as new autonomous driving technologies make their way to the marketplace.
A July 1, 2016, Los Angeles Times article argues that Tesla Motors has introduced such technology more aggressively than its competitors. On one hand, this gives Tesla a competitive advantage; on the other, it may make the company more vulnerable to future liability litigation over accidents caused by mechanical defect or malfunction.
Some commentators are critical of Tesla’s Autopilot concept, suggesting it gives motorists a false sense of security while still expecting them to quickly take over acceleration and braking if system sensors fail to spot a hazard.
Federal Investigation into Florida Fatality
The National Highway Traffic Safety Administration (NHTSA) is investigating the possible role played by Tesla’s Autopilot system in a fatal Florida collision on May 7 of this year, according to Bloomberg. Allegedly, the system sensors did not detect a white semi-tractor trailer against the bright sky as the truck pulled out in front of the Tesla. The trailer sheared off the top of the Tesla, which was reportedly still traveling at regular highway speeds at impact.
The 40-year-old Tesla driver died when his vehicle went right under the trailer. The New York Times reports that the remainder of the car continued up the highway, apparently while still on Autopilot, until it finally veered off into a ditch.
Potential Future Litigation
In general, crashes allegedly linked to new technology sometimes involve a claim that other reasonable manufacturers would not have released the technology without more comprehensive testing. Any company rushing new technology to the marketplace faces scrutiny and potential liability lawsuits claiming that the manufacturers who held back acted reasonably and prudently, while the company that rushed acted negligently.
In a defective product case involving system software, one might focus on whether sensors and other system components interacted properly. For example, if the system could not detect the white tractor-trailer against a bright sky, as Tesla acknowledged in a public statement, then a question may arise as to whether proper testing would have exposed the problem.
Therefore, if a jury concludes an auto manufacturer did not act reasonably, the manufacturer could be found liable for negligence. Such conduct exposes a potential defendant to compensation claims for a variety of losses including, but not limited to, medical costs, pain, suffering and lost wages. When loss of life occurs, survivors often seek compensation for loss of companionship as well.
In the case of the fatal Florida accident, it is believed that any decision on the part of the family to file a lawsuit will await the results of the federal investigation into the incident. A question remains as to whether the trucker and his trucking company are liable, because the semi did pull out in front of the Tesla.
Autonomous Vehicle Classifications
NHTSA classifies automated driving systems using a five-level classification system:
- Level 0 – Driver is in control at all times.
- Level 1 – Certain individual controls are automated. Electronic traction control is an example.
- Level 2 – Two or more controls are simultaneously automated. For example, lane maintenance and adaptive cruise control could be designed to work together.
- Level 3 – Under certain circumstances, the driver can give up control of all safety-critical systems. Such a system determines when a driver must assume control of the vehicle, allowing him/her enough time to do so.
- Level 4 – All driving functions are automated. An occupant is not expected to take control at any time. Therefore, Level 4 autonomy is found in driver-less and even unoccupied vehicles.
Tesla’s Autopilot is classified as a Level 2 system, although motorists have reportedly tried to push it to Level 3 performance. The 40-year-old Ohio motorist was allegedly known for pushing the limits of the system. He even posted a YouTube video illustrating the system in action.
The legal implications of alleged Autopilot misuse are unclear at this early stage. In the case of this first Tesla Model S fatality linked to its optional Autopilot, the circumstances are clouded further by the truck driver’s statement that he observed the motorist watching a movie in the moments before the crash, activity not consistent with Level 2 autonomy. Law enforcement did indeed find a portable DVD player in the cabin, although the trucker’s claim has not been independently verified.
Responsibility in Crashes
The Los Angeles Times article raises questions regarding responsibility for autonomous vehicle crashes. It quotes Consumers for Auto Reliability and Safety as suggesting that certain autonomous vehicle manufacturers are trying to have it both ways. That is, they may suggest that such a system will out-perform the motorist, yet they insist the onus remains on the motorist if something goes awry.
Google, a company whose stated goal is to produce a Level 4, fully driver-less vehicle, takes a different approach. It says it will accept full responsibility for any accidents linked to any future failures of its Level 4, driver-less design.
When motorist or manufacturer negligence leads to injury or death, personal injury statutes typically apply. Therefore, it is often possible for the victim or survivors to seek compensation for medical expenses, pain, suffering and possible lost wages. In some states, and under certain circumstances, a jury may award punitive damages in addition to compensatory damages, typically when it finds evidence of gross negligence.
If you or a family member is a victim of an accident in which an automated safety system may not have performed properly, you can consult with a personal injury attorney about the case. We do not charge for this consultation, and there is no obligation. Please contact us at your convenience to ask your questions or to arrange a visit.