Analyst & Professor Claim Tesla FSD Isn’t Ready For Prime Time & Won’t Be Any Time Soon

The Associated Press reported on August 28, 2024, that William Stein, a technology analyst at Truist Securities, has taken Elon Musk up on his invitation to try the latest versions of Tesla’s Full Self-Driving system three times in the past four months. A Tesla equipped with the technology can travel from point to point with little human intervention, the company says, yet each time Stein drove one, the vehicle made unsafe or illegal maneuvers. In fact, his most recent test drive in August left his 16-year-old son “terrified.” That’s troubling news for a company that says it will present a working prototype of a robotaxi in October.

Musk has told investors it is possible that Full Self-Driving will be able to operate more safely than human drivers by the end of this year, if not next year, but Stein says he doubts Tesla is anywhere close to deploying a fleet of autonomous robotaxis.

Note: See CleanTechnica’s latest test drive of FSD here: Tesla FSD 12.5 Test — Like A Perfect, Smooth Robotaxi (CleanTechnica Video).

Testing Tesla Full Self-Driving

For his latest test, Stein drove a rear-wheel-drive Tesla Model 3, which he picked up at a Tesla showroom in Westchester County, northeast of New York City. The car, the least expensive model Tesla offers in the US, was equipped with the latest Full Self-Driving software. During his ride, Stein said, the Tesla felt smoother and more human-like than past FSD versions. But in a trip of less than 10 miles, he said, the car made a left turn from a through lane while running a red light. “That was stunning,” Stein said.

He said he did not take control of the car because there was little traffic in the area, which made the maneuver seem less dangerous. Later the car drove down the middle of a parkway, straddling two lanes that carry traffic in the same direction. This time, Stein said, he intervened. The latest version of Full Self-Driving, Stein wrote to investors, does not “solve autonomy” as Musk has predicted. Nor does it “appear to approach robotaxi capabilities.” During two earlier test drives he took, in April and July, Stein said Tesla vehicles also surprised him with unsafe moves.

Stein said that while he thinks Tesla will eventually make money off its driving technology, he doesn’t foresee a robotaxi with no driver and a passenger in the back seat in the near future. He predicted it will be significantly delayed or limited in where it can travel. There’s often a significant gap, he pointed out, between what Musk says and what actually happens. While many Tesla fans have posted videos on social media showing their cars driving themselves without humans taking control, others have posted videos showing the cars doing dangerous things.

Alain Kornhauser, who heads autonomous vehicle studies at Princeton University, said he drove a Tesla borrowed from a friend for two weeks and found that it consistently spotted pedestrians and detected other drivers. Yet while it performed well most of the time, he had to take control when the Tesla made moves that scared him. He warns that Full Self-Driving isn’t ready to be left without human supervision in all locations. “This thing,” he said, “is not at a point where it can go anywhere.” He does think the system could work autonomously in smaller areas of a city where detailed maps help guide the vehicles, and he wonders why Musk doesn’t start by offering rides on a smaller scale. “People could really use the mobility that this could provide,” he said.

For years, experts have warned that Tesla’s system of cameras and computers isn’t always able to spot objects and determine what they are. Cameras can’t always see in bad weather and darkness. Most other robotaxi companies, such as Alphabet Inc.’s Waymo and General Motors’ Cruise, combine cameras with radar and laser (lidar) sensors.

Machine Learning Lacks Common Sense

Phil Koopman, a professor at Carnegie Mellon University who studies autonomous vehicle safety, said it will be many years before autonomous vehicles that operate solely on artificial intelligence are able to handle all real-world situations. “Machine learning has no common sense and learns narrowly from a huge number of examples. If the computer gets into a situation it has not been taught about, it is prone to crashing.”

Last April, near Seattle, a Tesla using Full Self-Driving hit and killed a motorcyclist. The Tesla driver, who has not yet been charged, told authorities that he was using Full Self-Driving and looking at his phone when the car rear-ended the motorcyclist, who was pronounced dead at the scene. The National Highway Traffic Safety Administration (NHTSA) told the AP that it is evaluating information about the fatal crash provided by Tesla and law enforcement officials, and that it is aware of Stein’s experience with Full Self-Driving.

NHTSA also noted that it is investigating whether a Tesla recall earlier this year, intended to bolster the company’s driver monitoring system, actually succeeded.

Today, there are about half a million Tesla vehicles on the road in the US with the Full Self-Driving suite installed.

The Takeaway

We have danced around the Full Self-Driving maypole many times in the past and will probably continue to do so in the future. Executive editor Zachary Shahan has filed numerous reports about his personal experiences with the system, most of them favorable but still noting things the system does that it is not supposed to do. It is my personal opinion, which is worth precisely what you paid for it, that no vehicle that relies solely on cameras will ever be able to drive itself safely without human supervision.

Other companies limit the use of their self-driving systems to certain roads and many use digital maps to guide them in urban driving settings. Musk, however, knows better and is determined that his cars will drive themselves — eventually — without the aid of lidar, radar, or digital maps. He is either a genius or a pigheaded fool. The jury is still out on that question.

I have one final thought, which no doubt will annoy some readers. I am a former motorcycle rider, and I can attest that I never, under any circumstances, gave my knowing consent to some jackass staring at his cell phone instead of looking at the road to rear-end me and end my life prematurely. Amid all the hype and hope about self-driving cars, the general public has a right to be protected from experiments on public roads that put innocent drivers at risk of death or serious bodily harm. People are offered no compensation to be guinea pigs in one of Elon’s cockamamie experiments. So far, federal and state regulators have utterly failed in their duty to protect the motoring public from the whims of a tech bro who is incapable of taking into account the safety of all drivers, not just those who buy his products.

