By Associated Press
LOS ANGELES — Federal safety regulators are sending a team to California to investigate a fatal freeway crash involving a Tesla, just after authorities near Oakland arrested a man in another Tesla rolling down a freeway with no one behind the steering wheel.
Experts say both cases raise pressure on the National Highway Traffic Safety Administration to take action on Tesla’s partially automated driving system called Autopilot, which has been involved in multiple crashes that have resulted in at least three U.S. deaths.
The probe of the May 5 crash in Fontana, California, east of Los Angeles, is the 29th case involving a Tesla that the agency has responded to.
“The details of whether the Tesla was in autonomous mode are still under investigation,” Officer Stephen Rawls, a spokesperson for the California Highway Patrol, said in an email Wednesday.
The Tesla driver, a 35-year-old man whose name has not been released, was killed and another man was seriously injured when the electric car struck an overturned semi on a freeway. The injured man, a 30-year-old passing motorist, was struck by the Tesla as he was helping the semi’s driver out of the wreck.
“We have launched a Special Crash Investigation for this crash. NHTSA remains vigilant in overseeing the safety of all motor vehicles and equipment, including automated technologies,” the agency said in a statement Wednesday.
The investigation comes just after the California Highway Patrol arrested another man who authorities say was in the back seat of a Tesla that was riding down Interstate 80 with no one behind the wheel.
Param Sharma, 25, is accused of reckless driving and disobeying a peace officer, the CHP said in a statement Tuesday.
The statement did not say if officials have determined whether the Tesla was operating on Autopilot, which can keep a car centered in its lane and a safe distance behind vehicles in front of it.
But it’s likely that either Autopilot or “Full Self-Driving” was in operation for the driver to be in the back seat. Tesla is allowing a limited number of owners to test its self-driving system.
Tesla, which has disbanded its public relations department, did not respond to messages seeking comment Wednesday.
The Fontana investigation, along with probes of two crashes in Michigan earlier this year, shows that NHTSA is taking a closer look at Tesla’s systems. Experts say the agency needs to rein in such systems because people tend to trust them too much even though the systems cannot drive themselves.
“I think they very likely are getting serious about this, and we may actually start to see some action in the not-too-distant future,” said Sam Abuelsamid, a principal mobility analyst at Guidehouse Insights who follows automated systems.
“I definitely think that the increasing number of incidents is adding more fuel to the fire for NHTSA to do more,” said Missy Cummings, an electrical and computer engineering professor at Duke University who studies automated vehicles. “I do think they are going to be stronger about this.”
Tesla says on its website and in owner’s manuals that for both driver-assist systems, drivers must be ready to intervene at any time. But drivers have repeatedly zoned out with Autopilot in use, resulting in crashes in which neither the system nor the driver stopped for obstacles in the road.
The federal agency could declare Autopilot defective and require a recall, or it could restrict Autopilot’s use to limited-access freeways. It could also make the company install a stronger system to ensure drivers are paying attention.
The auto industry, except for Tesla, already does a good job of limiting where such systems can operate, and is moving to self-regulate, Cummings said. Tesla appears to be heading that way, she said, noting that it is now installing driver-facing cameras on recent models.