Can I be Held Negligent if My Self-Driving Car Causes an Accident?
On the topic of automation, Frank Lloyd Wright said: “If it keeps up, man will atrophy all his limbs but the push-button finger.”
It is safe to assume that Mr. Wright had no idea how true his words would ring or how far technology would come.
In late May 2014, Google announced a new prototype for its self-driving car fleet that would have no steering wheel and no gas or brake pedals, a vehicle "designed to operate safely and autonomously without requiring human intervention."
One hundred percent of the driving would be undertaken by the car. The range finder mounted atop the car uses a laser to generate a 3D map of its environment, and sensors placed around the car eliminate blind spots and detect objects from a distance equivalent to the length of two football fields.
Unlike an airplane operating on autopilot mode, which still requires the pilots to account for unanticipated objects in the sky and regain control of the aircraft when necessary, the “operator” of this model of self-driving car would not even have the opportunity to intervene in the driving of the vehicle.
Automated cars will solve several problems. For example, they will provide a mode of personal transportation for the elderly and those with physical disabilities. However, as so often happens with new technologies, they will likely create many new problems.
From a legal perspective, the most uncertain aspect of self-driving cars involves potential liability if a self-driving car is involved in an accident.
In a normal vehicular negligence situation, human error results in property damage, personal injury, or both, and the person at fault is responsible for paying for the damage. When informal resolution is not possible between the parties, litigation typically ensues.
For an injured party to recover civil damages from the party at fault, the injured party must prove: (1) the at-fault party had a duty to use reasonable care to prevent injury to others; (2) the at-fault party breached that duty; (3) such a breach was the actual and proximate cause of the other party’s injuries; and (4) the other party suffered damages.
Negligent behavior is usually easy to identify in a car accident scenario: someone ran a red light, drove in excess of the speed limit, or recklessly weaved in and out of traffic. In each situation, the negligence analysis is straightforward. The driver had a duty to drive in a safe manner that did not put others at risk of injury. The individual drove in a dangerous manner, thereby breaching that duty. But for this dangerous driving behavior, the other party would not have been injured, and it was foreseeable that driving in such a manner would likely cause harm to someone in the other driver's position, such as another driver on the road. If the other party was in fact damaged, the "at-fault" driver would be liable for negligence.
In contrast, the scenario involving a self-driving car is anything but cut and dried. Assume that the self-driving car involved in a car accident is the new Google prototype with no steering wheel or pedals, and that the automated car caused the accident. A person sits in the driver's seat but has no ability to operate the car, even if the individual desperately wants to regain control to avoid an accident.
Does the same duty from the normal accident scenario apply to a driver of an automated car?
It seems logical to say that a person sitting in the driver’s seat of a car has a duty to use reasonable care to prevent harm to others, but how can that duty be imposed on someone who cannot take any action whatsoever? Even if the driver felt that more caution was necessary, such as driving at a slower speed or moving over to avoid another reckless driver, there is nothing the person can do to override the autonomous nature of the car.
In that instance, a court would be unlikely to impose a duty on a person to act a certain way when he or she is incapable of effecting any change through his or her behavior. It would be akin to saying that Driver 1, who recognizes the careless driving of Driver 2, would be negligent for Driver 2's behavior because Driver 1 acknowledged the dangerous behavior and did nothing to stop it. Driver 1 had no control over Driver 2's car, and a court would never hold Driver 1 accountable for Driver 2's negligence.
One potential way to impose a common law duty on the owner of a self-driving car would be through a legal concept called res ipsa loquitur. When the res ipsa loquitur doctrine is employed, the injured party does not have to prove negligence directly, because the harm is of a type that ordinarily would not occur unless the allegedly liable party was at fault. In other words, the law simply assumes that the only explanation for the incident is that negligence occurred.
However, a court is unlikely to apply res ipsa in the autonomous car context. If, for example, the accident is caused by a malfunction in the car, then a products liability issue is presented, not negligence. Because there would be uncertainty regarding why a self-driving car caused an accident, imposing the res ipsa presumption would be problematic.
Another possible way of imposing common law liability upon the owner of an autonomous vehicle is through strict liability. An individual can be held strictly liable for an injury caused by an inherently dangerous activity, even if that person took every reasonable precaution to prevent injury to another. Examples of such activities include explosive blasting; the transportation, storage, or use of radioactive or hazardous materials; and the keeping of wild animals. However, it is unlikely that a court would consider use of a self-driving car to be an inherently dangerous activity, especially considering that the cars are designed to top out at a speed of 25 mph and are built with the purpose of decreasing the rate of car accidents. Any harm that results from a self-driving car accident will likely be minimal. As such, there is not likely to be anything inherently dangerous about this activity, and common law strict liability would not apply.
Yet another option for imposing liability on the owner of a self-driving car would be through the passage of legislation creating liability.
Several states have “permissive use” statutes that impose liability on the owner of a car for the acts committed by a third party permissively using that car.
The California Vehicle Code, for example, holds every owner of a vehicle liable, up to a capped amount, for injury caused by a negligent or wrongful act or omission of any person using or operating the car with the owner's express or implied permission.
In other states, however, such as Alabama, the owner of a vehicle is not liable for the negligence of a permissive user of that vehicle.
Any law passed relating to self-driving cars would be of a similar nature to the permissive use statutes – the owner of a vehicle, simply from his status as “owner,” would be liable for harm caused to a third party, even if he is not driving the car. The statute would essentially say, “The owner of any autonomous vehicle that does not allow for human intervention is strictly liable for any injury or damage caused by the vehicle, up to ‘X’ amount.”
If statutory or common law provides for liability of the owner of a self-driving car, the next question is whether insurers will provide coverage for such accidents. Will insurers start with a broad exemption on autonomous vehicles as part of an individual’s automobile insurance coverage? Or will they only provide coverage to the self-driving cars that allow a person to override and regain control of the car? Even though the surplus lines market will likely cover self-driving cars initially – until a loss history is established – asking these questions now will provide insurance companies with the ability to adapt quickly to the evolving technology.
It is no secret that the law moves slower than innovation. With self-driving cars predicted to be on the market by 2020, now is the perfect time for legislatures and the insurance industry to act in a pre-emptive manner. Swift action establishing the extent of liability that will be imposed on owners of self-driving cars involved in an accident will provide clarity in an industry that is constantly evolving and filled with uncertainties.