The National Highway Traffic Safety Administration (NHTSA) claims that self-driving cars are 50% safer than human-operated cars. This claim is plausible: after all, most car accidents are caused by human error, which autonomous cars are meant to eliminate. However, there will always be a question as to who is liable in a car accident that involves one or more self-driving cars.
How Often do Self-Driving Cars Crash?
According to Business Finance News, there have been cases of auto collisions involving self-driving cars, most of which involved Google’s self-driving cars. In all cases except one, the person who was not in the self-driving car was found to be at fault for the accident. (Fortunately, nobody was injured in any of the reported collisions involving self-driving cars.) The one instance in which Google’s self-driving car was found to be at fault demonstrates that the automated systems in self-driving cars are not flawless. However, that vehicle had successfully driven nearly 1.5 million miles without human aid before the incident.
How is Liability Determined in a Self-Driving Car Crash?
When an autonomous car does crash, local traffic laws still apply to determine fault. However, there are countless causes of motor vehicle collisions, and each time an autonomous car is involved, fault and liability will be debated. For example, if the owner of an autonomous car knows that his car is malfunctioning but decides to drive it anyway, liability might be allocated to either the driver or the company that made the car. It might be argued that the owner was negligent because he chose to drive a malfunctioning vehicle that could cause injury or damage to someone. On the other hand, the company that made the car may also be responsible for producing a product that malfunctioned and caused injury when used for its intended purpose.
When self-driving cars cause accidents, each case will become part of a developing area of law. Some of the questions to be addressed will be similar to those in existing personal injury law. Here are just a few of the questions that will need to be answered in each individual case:
- Was there a defect with the design or manufacture of the self-driving car that led to the crash?
- Was the crash caused by the self-driving car foreseeable to the driver?
- Did the driver willingly drive his or her autonomous car knowing that the vehicle was malfunctioning and likely to cause harm to others?
- Could interference from a third party (such as a computer hacker) have caused the crash?
Another question is how insurance companies will react to autonomous vehicles. If the number of car accidents significantly decreases due to self-driving cars, will auto insurance still be mandatory, and will insurance rates drop? Will the companies that manufacture autonomous cars be required to carry motor vehicle insurance to cover the damages that their faulty cars could potentially cause?
Attorneys in California have already started to focus on this new area of law and have given themselves the title of “autonomous self-driving accident attorneys.” They are currently developing ways to assess liability in future personal injury cases. These attorneys are also evaluating the different types of evidence that would be the most useful in an autonomous car accident.
Liability is always a matter of context. Because self-driving cars have only just become available to the public, the legislation and jurisprudence involving them are still in their infancy. This could lead to an extensive and complicated trial if you were injured by a self-driving car. The interaction of both personal injury and product liability law with new technological advances is what makes autonomous cars such a legal mess. Only time will settle how courts deal with autonomous car cases.