Recent Tesla Full Self Driving Fails — Just A Blip In The Progress? Or Endemic?
With Tesla Full Self Driving (FSD) version 12.3 being much more natural and usable than previous versions, I have been using it more in order to track its progress and try to understand it better. It’s still much more enjoyable to drive the car myself, but the technology is truly amazing, a marvel. To get to the level it’s at is a very cool achievement.
Still, I keep returning to my concern that both the hardware approach and the software approach are inadequate for robotaxi service.
One key issue is that the cameras are simply not always able to see well enough for FSD to operate. I have just one Tesla, not 100, 1,000, or one million. For a safe robotaxi service, you need hardware that works well beyond 99% of the time in every FSD Tesla on the road. If the hardware in my one car is inadequate, it’s clear that we are light years from the finish line. The other day, we were getting some light rain. It’s not summer, and we aren’t yet being hit with the huge thunderstorms that I expect will present serious visibility challenges. This was quite light rain. I tried to turn on FSD to see how it did in these conditions. It wouldn’t turn on. Apparently, even that little bit of rain was too much for my car’s cameras.
So, there’s that.
There’s an argument that as the neural nets learn, as Tesla processes more data and improves, the AI will be able to operate under more and more difficult conditions even with the current hardware. I’m not sold. I see it struggling too much relying on cameras. I see enough “edge cases” (common edge cases, I would actually call them) that I don’t think “vision only” is adequate. I may be wrong. People in the field believe I’m wrong. Other people in the field believe I’m right — or, I should say, have the same opinion.
Early-morning sunlight and late-day sunlight also cause visibility problems. Again, when Tesla’s cameras and FSD system face these problems, it could find a way to work around them just like humans do, but I’m not confident of that at the moment. At the least, we have to acknowledge it’s a problem today. We can all have different opinions regarding Tesla AI’s ability to solve the problem in time.
As I was scanning X/Twitter this morning to see what Tesla discussions are happening there, I also ran across these two tweets:
FSD 12.3.4 turned left from a straight only lane coming out of a very popular mall parking lot. Someone was right next to me in the correct left turn lane & it cut them off. Talk about _awkward_! 😬
12.3.4 also tried to stop at a green light, almost creating a rear end…
— Jack Hopman (@jackhopman) April 11, 2024
My wife was driving with FSD 12.3.4 engaged. This is a two way stop, with fast 60+ mph cross traffic. It’s a dangerous intersection.
There was a car on our left waiting to turn left, waiting on opposing traffic to clear. Our Tesla jumped out into the middle of the road and had… pic.twitter.com/oQD7uIyWF1
— Simon (@_Falcon_Fury) April 15, 2024
1. “FSD 12.3.4 turned left from a straight only lane coming out of a very popular mall parking lot. Someone was right next to me in the correct left turn lane & it cut them off. Talk about _awkward_!”
Turning from the wrong lane and cutting off another car is not good. It’s shocking that this is still happening, in my humble opinion. Maybe it will be solved soon and this will stop happening. But it’s still a serious problem and highly dangerous.
2. “12.3.4 also tried to stop at a green light, almost creating a rear end collision.”
This one is even more shocking and even more concerning for me. How can the system still confuse a green light for a red light in some situations? How bad are the cameras in certain conditions if this is happening? If it’s purely a software issue instead of a hardware issue, what the heck went wrong? Of all the issues I’ve seen still occurring with version 12.3, this is the one that shocks and concerns me the most.
3. “There was a car on our left waiting to turn left, waiting on opposing traffic to clear. Our Tesla jumped out into the middle of the road and had a path planned that would have no doubt resulted in a major traffic accident with the fast moving SUV, had my wife not slammed on the brakes.”
This one feels like something that would be solved and avoided in the future with growing data and processing. However, it’s the scariest failure I’ve seen with the current version of FSD — with slightly more trust in what FSD was doing, or slightly less attentiveness from the driver, lives could have been lost. It makes me think that we could have a very clear, heartbreaking, and public FSD death any day. And what would that do to Tesla FSD progress, by the way?
Robotaxis Right Around the Corner?
I do not claim to know what is going to happen with Tesla FSD and potential robotaxis in the coming months and years. I’ve seen too much and have listened to too many arguments on both sides. However, if I were forced to make a guess right now, I’d guess that Tesla’s approach is not going to work (again) and will need some serious restructuring. Yes, the system solves some problems as it changes, but I’m still concerned it creates other problems along the way as well. I’m concerned about what I wrote about almost two years ago — a “see-saw problem” of solutions in certain cases leading to mistakes in other very similar but critically different cases. (I’m actually shocked to discover I wrote that article almost two years ago.)
Actually, I recommend going through the comments there. Many people were defending Tesla’s approach and claiming Elon knew best (as he always does) when it came to the approach the company was using then. It turns out that approach was inadequate and was ditched after all. So, who was right at the time? Should we have just trusted Elon and his AI chops then, or the Tesla team he was working with in 2020, or in 2018? You could say what Tesla was doing then led to where we are now in a clear and useful way, and that we had to start there to get here. Okay, that’s one argument. However, should we expect that the approach is now correct and we just need more time and data? Yes, the system is better today than it once was, but it is still so far from robotaxi capability that I think it takes a very large leap of faith — a reckless leap of faith — to assume that Tesla is now on the right path to robotaxi capability.
Again, for extra fun, I highly recommend exploring the comment thread on that article again.