Does Consumer Watchdog Founder Want Asimov-like Rules for Robot Cars?
A virtual grin arose on the face of Harvey Rosenfield at the mention of Isaac Asimov’s Three Laws of Robotics.
The mention came during a phone call, but the satisfaction behind his smile was easily detectable in his voice during a conversation that otherwise had an ominous tone.
It was as if he had been waiting for someone to make a connection between a set of principles for autonomous vehicle technology he has authored and the famed robotics rules penned in 1942 by the science fiction author.
Rosenfield is a well-known consumer protection advocate and author of California’s long-standing and far-reaching auto insurance regulation, Proposition 103.
Like Asimov, Rosenfield intends to issue a warning about the future. Asimov brought awareness to the potential impact of robots on society. Rosenfield is urging people to beware of the promise of safety and luxury that self-driving vehicles seem to hold.
Worrying about self-driving vehicles seemingly puts Rosenfield on the same page as the insurance industry, which some believe will be hurt by a decline in the need for personal auto insurance when and if self-driving cars are the norm on the world’s streets and highways.
But Rosenfield, founder of Consumer Watchdog, a Santa Monica, Calif.-based group that’s often at odds with the insurance industry, is not on the same page as the industry – nor even in the same ballpark. A rare détente between the two is far from reality.
In fact, his warnings about autonomous vehicles take aim at the insurance industry and auto manufacturers.
In a recent report he authored, “Self Driving Vehicles: The Threat to Consumers,” Rosenfield frets that John and Jane Driver will shoulder the blame – and often foot the bill – for autonomous vehicle mishaps, and that the insurance and auto industries see the advent of driverless cars as an opportunity to weaken consumer protections.
“The sparkly chimera of robots replacing human drivers – freeing people to spend their drive time more enjoyably and productively – has captivated the public and media, driven by self-interested auto manufacturers and software developers,” the report states. “But there has been very little public discussion of whether self-driving vehicles will coexist or collide with long-standing principles of accountability, transparency, and consumer protection that collectively constitute the Personal Responsibility System.”
The system the report refers to is the body of state-based liability and insurance laws.
In an interview with Insurance Journal about his report, Rosenfield said he’s worried about the impact of self-driving vehicles on consumers, particularly lower-income consumers.
“Let’s start with the auto industry,” Rosenfield said. “No. 1, of course, cars will become safer, hopefully, but the more automated they get, the more expensive they’ll be to fix, the more vulnerable they’ll be to hacking, the more costly they’ll be to buy, because this technology is not going to be cheap. The auto industry is probably going to end up marketing the self‑driving vehicles to very wealthy people, just the way that they do today with optional high‑priced equipment that only the richest people can afford.”
In his report, Rosenfield argues that self-driving vehicles may be only as safe as people can afford, and that the manufacturing and insurance industries are exploring ways to limit or shift their responsibility, given that safety-related costs and claims are likely to increase as a result of the new technologies.
Wade Newton, a spokesman for the Alliance of Automobile Manufacturers, said self-driving vehicle technologies hold great promise to transform mobility for everyone.
“The Alliance supports policy initiatives that facilitate safety innovations and remove legislative and regulatory hurdles to the advancement of self-driving vehicles,” Newton wrote via email in reply to a request for comment for this story.
He added: “We have not reviewed this report, but given the fact that government figures show that driver behavior contributes to a full 94 percent of crashes, we can all agree that automated systems have the potential to further enhance road safety.”
Rosenfield believes the insurance industry will contend that since self‑driving vehicles are just around the corner, drivers aren’t going to be needed any longer, and “therefore, we don’t need consumer protections against insurance rip offs, fraud, and abuse that the voters passed back in 1988, when they passed Proposition 103,” he added.
As long as consumers can be blamed for a crash – whether by the manufacturer when something goes wrong with the car or by the software company that programmed the vehicle – they will still need to buy insurance coverage, and they’re going to need the protections of Prop. 103, he said.
Prop. 103, passed by California voters in November 1988, requires prior approval from the California Department of Insurance before insurance companies can change property/casualty rates. Rate filings from carriers get CDI review as well as public review.
Rosenfield said that Prop. 103 eliminated a “whole host of discriminatory practices that the insurance companies like to engage in when they don’t want to sell insurance to people without elite occupations,” or higher education.
According to him, the industry “has been dying to get rid of Prop. 103,” and autonomous vehicles may be the industry’s opportunity to do that.
California’s insurance industry probably wouldn’t be upset to see Prop. 103 go. However, Rosenfield is overstating the role Prop. 103 plays in the world of insurance, according to Mark Sektnan, president of the Association of California Insurance Companies.
“It is also unfortunate that Mr. Rosenfield continues to misrepresent and misunderstand how the advent of driverless cars is going to change the insurance market and his comments do nothing to further the discussion about how we deal with a future that is almost here,” Sektnan said. “Sadly he seems locked in the last century and unable to adapt to the changing future. The world has changed a lot since the voters passed the initiative.”
Sektnan did acknowledge that he wouldn’t mind seeing Prop. 103 revamped or rethought as we enter a new world of driverless vehicles.
“It’s almost 30 years old and everything needs a review after 30 years,” Sektnan said. “I think it’s going to be very hard to fit a driverless vehicle into an insurance rating regimen that is based on a human driver.”
Driving record is one of the rating factors mandated by Prop. 103. When issuing a policy for someone with an autonomous vehicle, which record applies – the car’s or the software maker’s?
“When you don’t have a human at the wheel what does driving record mean?” Sektnan said.
He said the industry has “been engaged in this issue” with the goal of ensuring there is appropriate liability.
Rosenfield believes the liability will fall in the laps of drivers.
“That’s what’s already happened,” he said. “If you look, for example, at what happened with the crashes involving Tesla. Tesla has consistently blamed the motorists, the owners of the car, for the crashes. I project we see the future being the same as it is now. Whereas, if there’s a crash, the manufacturers aren’t going to step up and accept responsibility for it, they’re going to try to avoid responsibility, and blame the consumer, and fight it out.”
In one of the most watched crash incidents involving an autonomous vehicle, the U.S. National Highway Traffic Safety Administration in early 2017 found that the owner of a Tesla Motors Inc. Model S sedan that drove itself into the side of a truck in 2016 had ignored the manufacturer’s warnings to maintain control even while using the driver-assist function.
Throughout his report, and throughout the IJ interview, Rosenfield used the phrase “robot cars” so liberally that one might suspect he was trying to drive home a point. Is he?
“I am, because one of the things that is very clear is that the software in these robot cars is going to take the place of human judgment, of human values, of human morality,” he said. “That software is being written, guess who, by Google and other Silicon Valley companies, and eventually of course the auto companies will buy that software. We as consumers, we as the public, and we as human beings have no control over the software that’s being written. What happens when the software in the robot car detects that there are pedestrians that are about to jump in front of the car for whatever reason – a baby carriage, a stroller accidentally rolling down the street?”
The car is going to have to make a decision that is now made by human beings: whether to drive into a stroller or into a tree.
“We don’t know what decision that the software is going to make,” Rosenfield said. “It’s going to be a life and death decision. Which way is it going to go?”
This is where a little light gets shined on Rosenfield’s “robot cars” phrase.
Asimov wrote his three laws of robotics to deal with decisions that the robots of his imaginary future were to face:
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.
In his report, Rosenfield offers his own version of these laws: six principles that he believes must be adopted to deal with the challenges of these “robot cars.”
Asimov’s three laws of robotics have been widely popularized in entertainment, but it’s not clear that the ideas behind those laws – to protect people – will be shared, he said.
“That’s the philosophical conundrum that everybody’s going to have to reckon with,” Rosenfield said.
Rosenfield’s six principles are spelled out in his full report.
Rosenfield offered up a bottom line on all of this.
“No. 1, we think that we need to make sure that the industries involved respect our current human democratic values and cultural morals as they build these cars and deploy them,” he said.
Related:
- Q&A: How Driverless Car Technology Could Impact New York and Other States
- Insurers Say They Need Access to Data from Automated Car Systems
- Auto Insurance Market to Shrink by 70% by 2050: KPMG
- Automakers Seek Changes to Federal Autonomous Vehicle Guidelines
- How ‘Exposure Data Tracking’ Is Taking Over Personal Lines Insurance
- Will Technology Make Insurance Obsolete?