DETROIT — Earlier this week, Tesla sent out its “full self-driving” software to a small group of owners who will test it on public roads. But buried on its website is a disclaimer that the $8,000 system doesn't make the vehicles autonomous and drivers still have to supervise it.
The conflicting messages have experts in the field accusing Tesla of deceptive, irresponsible marketing that could make the roads more dangerous as the system is rolled out to as many as 1 million electric vehicle drivers by the end of the year.
“This is actively misleading people about the capabilities of the system, based on the information I've seen about it,” said Steven Shladover, a research engineer at the University of California, Berkeley, who has studied autonomous driving for 40 years. “It is a very limited functionality that still requires constant driver supervision.”
On a conference call Wednesday, Musk told industry analysts that the company is starting full self-driving slowly and cautiously “because the world is a complex and messy place.” It plans to add drivers this weekend and hopes to have a wider release by the end of the year. He referred to having a million vehicles “providing feedback” on situations that can’t be anticipated.
The company hasn’t identified the drivers or said where they are located. Messages were left Thursday seeking comment from Tesla.
The National Highway Traffic Safety Administration, which regulates automakers, says it will monitor the Teslas closely “and will not hesitate to take action to protect the public against unreasonable risks to safety.”
The agency says in a statement that it has been briefed on Tesla’s system, which it considers to be an expansion of driver assistance software, which requires human supervision.
“No vehicle available for purchase today is capable of driving itself,” the statement said.
On its website, Tesla touts in large font its full self-driving capability. In smaller font, it warns: “The currently enabled features require active driver supervision and do not make the vehicle autonomous. The activation and use of these features are dependent on achieving reliability far in excess of human drivers as demonstrated by billions of miles of experience, as well as regulatory approval, which may take longer in some jurisdictions.”
Even before using the term “full self-driving,” Tesla named its driver-assist system “Autopilot.” Many drivers relied on it too much and checked out, resulting in at least three U.S. deaths. The National Transportation Safety Board faulted Tesla in those fatal crashes for letting drivers avoid paying attention and failing to limit where Autopilot can be used.
Board members, who have no regulatory powers, have said they are frustrated that safety recommendations have been ignored by Tesla and NHTSA.
Bryant Walker Smith, a University of South Carolina law professor who studies autonomous vehicles, said it was bad enough that Tesla was using the term “Autopilot” to describe its system but elevating it to “full self-driving” is even worse.
“That leaves the domain of the misleading and irresponsible to something that could be called fraudulent,” Walker Smith said.
The Society of Automotive Engineers, or SAE, has developed six levels, zero through five, to describe the functions of autonomous vehicles. In levels zero through two, humans drive the cars and supervise partially automated functions. In levels three through five, the vehicles do the driving, with level five describing a vehicle that can drive itself under all traffic and weather conditions.
The term “full self-driving” means there is no driver other than the vehicle itself, indicating that it would be appropriate to have no one in the vehicle at all, Walker Smith said.
Musk also said on Wednesday that Tesla would focus on setting up a robotaxi system in which one person could manage a fleet of 10 self-driving cars in a ride-hailing service.
“It wouldn't be very difficult, but we're going to just be focused on just having an autonomous network that has sort of elements of Uber, Lyft, Airbnb,” he said.
Tesla is among 60 companies with permits to operate autonomous vehicles with human backup drivers in California, the No. 1 state for Tesla sales. The companies are required to file reports with regulators documenting when the robotic system experiences a problem that requires the driver to take control, a mandate that could entangle the owners of Tesla vehicles in red tape.
Before Tesla is able to put fully self-driving vehicles on California roads, it will have to get another permit from state regulators. Only five companies, including Google spin-off Waymo and General Motors’ Cruise subsidiary, have obtained those permits.
The California Department of Motor Vehicles didn’t immediately respond to questions about Tesla’s latest plans for robotic cars.
NHTSA, which has shied away from imposing regulations for fear of stifling safety innovation, says that every state holds drivers accountable for the safe operation of their vehicles.
Walker Smith argues that the agency is placing too much of the responsibility on Tesla drivers when it should be asking what automakers are going to do to make sure the vehicles are safe. At the same time, he says that testing the system with vehicle drivers could be beneficial and speed adoption of autonomous vehicles.
Thursday afternoon, Musk was clearly trying to sell the full self-driving software. He wrote on Twitter that the price of “FSD beta” will rise by $2,000 on Monday.