Regulators Investigating Fatal Collision of Tesla on Autopilot

July 1, 2016

U.S. regulators are investigating a fatal accident involving a Tesla Motors Inc. sedan that was driving on autopilot, drawing scrutiny to a key technology the electric-vehicle maker is betting on for the future of self-driving cars.

The crash involved a 40-year-old Ohio man who was killed when his 2015 Model S drove under the trailer of an 18-wheeler on a highway near Williston, Florida, according to a Florida Highway Patrol statement. Shares in the automaker fell 2.7 percent to $206.50 in late trading.

The details of the accident are likely to add fuel to the debate over whether self-driving cars are ready for the real world. Autopilot didn’t notice the white side of the tractor trailer against a brightly lit sky, so the brake wasn’t applied, said Tesla, which reported the May 7 incident to the National Highway Traffic Safety Administration. In a blog post, Tesla said the crash is the first known fatality in more than 130 million miles of Autopilot driving.

If the Autopilot system didn’t recognize the tractor trailer, then Tesla will have to recall the cars to fix the flaw, said Clarence Ditlow, executive director of the Center for Auto Safety, an advocacy group in Washington. Ditlow said that Tesla’s Autopilot system needs to be able to recognize all possible road conditions.

“That’s a clear-cut defect and there should be a recall,” Ditlow said in a phone interview. “When you put Autopilot in a vehicle, you’re telling people to trust the system even if there is lawyerly warning to keep your hands on the wheel.”

Tesla said in the post on Thursday that it requires explicit acknowledgment from the vehicle owner that Autopilot “is new technology and still in public beta phase” before it will enable the system. No other automaker sells unproven technology to customers, said Eric Noble, president of CarLab Inc., a consulting firm in Orange, California.

“There’s not an experienced automaker out there who will let this kind of technology on the road in the hands of consumers without further testing,” Noble said in a phone interview. “They will test it over millions of miles with trained drivers, not with consumers.”

Tesla, on a mission to fulfill Chief Executive Officer Elon Musk’s vision of a revolution in sustainable transportation, has had a rocky year. Shares fell 40 percent by Feb. 10 on concerns about production of the Model X sport utility vehicle, then rose 85 percent in the next two months on enthusiasm over the smaller Model 3 sedan, which generated 373,000 reservations accompanied by $1,000 deposits.

Just in the last month, NHTSA asked the youngest and smallest publicly traded U.S. automaker for information about its suspension systems following a report by the Daily Kanban blog. The government characterized the inquiry as a routine data collection, and Tesla insisted there’s no safety defect. The company did, however, revise its so-called Goodwill Agreements to make it clear that customers are free to report safety concerns to regulators.

Then Tesla shares fell more than 10 percent on June 22, the day after the company announced a proposal to acquire SolarCity Corp., the rooftop solar company that also counts Musk as its chairman and largest shareholder. The stock declined a total of 12 percent this year through Thursday.

Tesla has always prided itself on its safety record. In August 2013, the Model S sedan was awarded a 5-star safety rating by the National Highway Traffic Safety Administration. The company’s website states that “Model S comes with Autopilot capabilities designed to make your highway driving not only safer, but stress free.”

“What we know is that the vehicle was on a divided highway with Autopilot engaged when a tractor trailer drove across the highway perpendicular to the Model S,” Tesla said in the post. “Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied. The high ride height of the trailer combined with its positioning across the road and the extremely rare circumstances of the impact caused the Model S to pass under the trailer, with the bottom of the trailer impacting the windshield of the Model S.”

News of the investigation comes as the world’s automakers are making increasing forays into self-driving features, technology that is built on the promise of saving lives.

In its statement, NHTSA said it sent a special crash investigation team to the scene, a step the agency reserves for accidents that represent emerging areas of interest. The safety agency said it will examine the design and performance of the automated driving systems in use at the time of the crash.

NHTSA emphasized in its statement that its preliminary evaluation of the incident doesn’t indicate any conclusion about whether the Tesla vehicle was defective.

Tesla said that the “customer who died in this crash had a loving family and we are beyond saddened by their loss. He was a friend to Tesla and the broader EV community, a person who spent his life focused on innovation and the promise of technology and who believed strongly in Tesla’s mission. We would like to extend our deepest sympathies to his family and friends.”

The victim, Joshua Brown, had posted videos on YouTube demonstrating the ability of Autopilot to avoid accidents. An online obituary said he was a former Navy SEAL.

Tesla began rolling out its Autopilot features in October 2015. Autopilot is a step toward autonomous or self-driving cars, and includes features like automatic lane changing, auto steering and the ability of the vehicle to parallel park itself. In release notes about the software updates that are sent to owners, Tesla stresses that drivers still maintain responsibility for safe driving and should keep their hands on the wheel at all times.

“Similar to the autopilot functions in airplanes, you need to maintain control and responsibility of your vehicle while enjoying the convenience of Autopilot in Model S,” Tesla has said.

Self-driving and semi-autonomous cars have a good track record so far, but they aren’t perfect.

In February, a Lexus-model Google self-driving car hit the side of a bus near the company’s Silicon Valley headquarters. The vehicle was in autonomous mode going about 2 miles per hour around sandbags in the road. Google’s software detected the bus but predicted that it would yield, which it did not, according to a company report about the incident. There were no injuries reported at the scene, the company noted. “In this case, we clearly bear some responsibility, because if our car hadn’t moved there wouldn’t have been a collision,” Google said in its report.
