Three times in the past four months, William Stein, a technology analyst at Truist Securities, has taken Elon Musk up on his invitation to try the latest versions of Tesla’s vaunted “Full Self-Driving” system.
A Tesla equipped with the technology, the company says, can travel from point to point with little human intervention.
Yet each time Stein drove one of the cars, he said, the vehicle made unsafe or illegal manoeuvres. He said his most recent test drive earlier this month left his 16-year-old son, who accompanied him, “terrified.”
Stein’s experiences, along with a Seattle-area Tesla crash involving Full Self-Driving that killed a motorcyclist in April, have drawn the attention of federal regulators in the U.S. They have been investigating Tesla’s automated driving systems for more than two years because of dozens of crashes that raised safety concerns.
The problems have led people who monitor autonomous vehicles to become more skeptical that Tesla’s automated system will ever be able to operate safely on a widespread scale.
The latest incidents come at a pivotal time for Tesla. Musk has told investors it’s possible that Full Self-Driving will be able to operate more safely than human drivers by the end of this year, if not next year.
And in less than two months, the company is scheduled to unveil a vehicle built expressly to be a robotaxi. For Tesla to put robotaxis on the road, they would have to meet national U.S. standards for vehicle safety. Musk has said the company will show regulators that the system can drive more safely than humans.
Musk has released data showing miles driven per crash, but only for Tesla’s less-sophisticated Autopilot system.
Self-driving feature already in use
Full Self-Driving is being used on public roads by roughly 500,000 Tesla owners — slightly more than one in five Teslas in use today. Most of them paid $8,000 US or more for the optional system.
The company has cautioned that cars equipped with the system cannot actually drive themselves and that motorists must be ready at all times to intervene if necessary. Tesla also says it tracks each driver’s behaviour and will suspend their ability to use Full Self-Driving if they don’t properly monitor the system.
Musk, who has acknowledged that his past predictions for autonomous driving proved too optimistic, promised in 2019 a fleet of autonomous vehicles by the end of 2020. Five years later, many who follow the technology say they doubt it can work across the U.S. as promised.
“It’s not even close, and it’s not going to be next year,” said Michael Brooks, executive director of the Center for Auto Safety.
The car that Stein drove was a Tesla Model 3, which he picked up at a Tesla showroom in Westchester County, north of New York City.
The car, Tesla’s lowest-price vehicle, was equipped with the latest Full Self-Driving software, which Musk says now uses artificial intelligence to help control steering and pedals.
Unpredictable behaviour
During his ride, Stein said, the Tesla felt smooth and more human-like than past versions. But in a trip of less than 10 miles (16 kilometres), he said the car made a left turn from a through lane while running a red light.
“That was stunning,” Stein said, adding that he didn’t take control, because the road was empty.
Later, the car drove down the middle of a parkway, straddling two lanes that carry traffic in the same direction. This time, Stein said, he intervened.
The latest version of Full Self-Driving, Stein wrote to investors, does not “solve autonomy” as Musk has predicted. Nor does it “appear to approach robotaxi capabilities.”
Tesla has not responded to messages seeking comment.
Stein said that while he thinks Tesla will eventually make money off its driving technology, he doesn’t foresee a robotaxi with no driver and a passenger in the back seat in the near future. He predicted it will be significantly delayed or limited in where it can travel.
There’s often a significant gap, Stein pointed out, between what Musk says and what is likely to happen.
Many Tesla fans have posted videos on social media showing their cars driving themselves without humans taking control, but others have posted videos showing dangerous behaviour.
Expert warnings
For years, experts have warned that Tesla's system of cameras and computers isn't always able to spot objects and determine what they are. Cameras can't always see in bad weather and darkness. Most other companies developing autonomous robotaxis combine cameras with radar and laser sensors, but experts say even those vehicles can't always drive reliably yet.
Last April in Snohomish County, Washington, near Seattle, a Tesla using Full Self-Driving hit and killed a motorcyclist, authorities said.
The National Highway Traffic Safety Administration said it’s evaluating information on the fatal crash from Tesla and law enforcement officials. It also says it’s aware of Stein’s experience with Full Self-Driving.
As Tesla electric vehicle sales have faltered for the past several months despite price cuts, Musk has told investors that they should view the company more as a robotics and artificial intelligence business than a car company. Yet Tesla has been working on Full Self-Driving since at least 2015.
“I recommend anyone who doesn’t believe that Tesla will solve vehicle autonomy should not hold Tesla stock,” he said during an earnings conference call last month.