JCERRN
Well-known member
Joined May 3, 2021

You said it yourself, it's better than it was. With more data and more real-world scenarios, ideally it will continue to improve. The measurement in inches was possible due to the ultrasonic sensors Tesla used to equip their vehicles with. Tesla is making a bet/gamble on vision-only autonomy that will make or break the whole concept. Other companies (including Rivian, apparently) completely disagree with the approach and are even building their cars to potentially support Lidar. By buying a Tesla, you are buying into their approach. Time will tell if they are right about AI vision-based autonomy. I would wager yes, but it will take a lot of time and a whole lot of compute. Probably not HW4 or even HW5.
It was never "unsupervised"; I was watching everything intently. It made a sudden, small, quick move that jammed it against the pole before I could react. And there are always grey areas in such documents.
In some states there are specific laws saying you cannot "sign away" your rights. I'm not sure how widespread that is, but it should be universal. Releases, waivers, NDAs, etc. are usually crafted by lawyers to shift all of the responsibility to the other party. That's what they are for, but sometimes they don't stand up in court because they overreach. I have seen releases claiming no liability even in the case of willful misconduct on their part. So one can never assume there is no recourse. Of course we all know, and I said, that they would give the "supervised" answer. So I don't "expect" anything from Tesla, but I will discuss it with them and see their reaction.
I report this because it has some different characteristics than the typical FSD issues I have experienced and read about. Maybe others will find this provocative and useful.
1. This was not a dynamic problem. The vehicle was essentially still.
2. The auto park movements can be very jerky on CT -- it appears to be iterative, which is how it got itself in a jam.
3. The resolution of the data space is very low compared to previous Tesla equipment with other sensors. Our original M3 showed very clear images of objects and gave accurate distances (in inches). It seems to be a Musk quest to do it all with "vision". That's kind of foolish from my perspective, but that's how it is.
4. As a result, it is not aware of its tolerances with enough accuracy to avoid hitting a stationary object at very low speeds -- say less than an inch/sec. This did surprise me. After all, it has the proximity "heat maps".
5. In close situations, it seems not to model the movement of the car accurately, e.g., not taking rear steering into account (I'm guessing).
6. I'm guessing that these all worked together to allow the scrape.
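To illustrate point 5, here is a minimal sketch of why ignoring rear steering matters at parking speeds. It uses a standard kinematic bicycle model; the wheelbase and steering angles are rough illustrative numbers, not Tesla specifications, and this is only a guess at the kind of mismatch involved:

```python
import math

def turning_radius(wheelbase_m, delta_front_rad, delta_rear_rad=0.0):
    """Kinematic bicycle-model turning radius.

    With counter-phase rear steering (rear wheels turned opposite to
    the fronts), the effective radius shrinks, so the vehicle's corners
    sweep noticeably different arcs than a front-steer-only model would
    predict -- enough to matter within inches of a pole.
    """
    denom = math.tan(delta_front_rad) - math.tan(delta_rear_rad)
    return wheelbase_m / denom

WHEELBASE = 3.64  # approximate Cybertruck wheelbase in metres (illustrative)
steer = math.radians(15)

r_front_only = turning_radius(WHEELBASE, steer)                     # rear wheels assumed straight
r_with_rear = turning_radius(WHEELBASE, steer, -math.radians(5))    # counter-phase rear steer

print(f"front-steer only : {r_front_only:.2f} m")
print(f"with rear steer  : {r_with_rear:.2f} m")
```

The counter-phase case gives a tighter radius, so a planner that assumes front-only steering would place the rear quarter panel in the wrong spot by a margin far larger than the clearance in a tight parking maneuver.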
As far as liability, I expect Tesla is completely aware of these issues, but I don't recall any specific discussion of these aspects with regard to FSD and Auto Park. So there could be issues with them withholding or downplaying inconvenient information. Indeed, their thrust is to try like crazy to convince people to buy and use FSD, and then out of the other side of their mouth they say, but you can't trust it. This is an ambiguous position, to say the least.
I would say, generally, it's better than it was (except for the dumbing down of the sensors), but it has always been worse than Tesla portrays it. So I think there is probably shared responsibility for such things. I am culpable for trusting it too much, that's for sure. But I think they should have said more about tolerances and perhaps given some guidance from experimental data. Maybe they did and I missed it?
Take it all for what it's worth to you, it's just FYI and IMO.