Tesla removes ultrasonic sensors from new Model 3/Y builds, soon Model S/X

charliemagpie

Well-known member
First Name
Charlie
Joined
Jul 6, 2021
Threads
42
Messages
2,909
Reaction score
5,177
Location
Australia
Vehicles
CybrBEAST
Occupation
retired
Country flag
Computation happens 100 times per second.

Jeez, by the time we come up with examples, the AI will have computed 500 man-hours of human thought.
And we are still turning our heads to check traffic coming from the other direction.

If we can see a car, so can the AI.
If we don't have X-ray vision, the car doesn't need it. It can work out the risk and drive accordingly, better than any human.

It will soon handle any condition as well as 99% of drivers.

I don't know why we expect it to drive through a situation that a human wouldn't, or couldn't, drive through.

Progressively, driver assist and FSD will, as a collective, make any of our concerns redundant.

firsttruck

Well-known member
Joined
Sep 25, 2020
Threads
178
Messages
2,576
Reaction score
4,111
Location
mx
Vehicles
none
Country flag
.....

If we can see a car, so can the AI.
If we don't have X-ray vision, the car doesn't need it. It can work out the risk and drive accordingly, better than any human.
....
That is not true. With obstructions in certain positions around the car, a human driver looking in a single direction can see things the B-pillar camera cannot.

It has nothing to do with X-ray vision. It is simple geometry.
The only cameras on the Tesla that look directly to the side are the B-pillar cameras, and the B-pillar sits almost halfway down the length of the car from the front. Most drivers 5'8" or taller can easily lean forward and gain a viewpoint 1 to 2 feet ahead of the B-pillar.

Chuck Cook and other FSD Beta testers have shown this, and a simple diagram proves it; Chuck Cook talks about it in some of his videos.

Tesla's camera-FOV diagram, taken by itself, is misleading.
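To put rough numbers on that, here is a quick similar-triangles sketch. Every distance in it is an assumption chosen for illustration, not a measurement of any real car or intersection.

```python
# A minimal similar-triangles sketch of the B-pillar argument. All distances
# are assumptions for illustration only. Coordinates are relative to the
# car's nose: +x is forward, +y is up the cross street toward approaching
# traffic.

def visible_distance(setback_m, lane_x_m, corner_x_m, corner_y_m):
    """How far up the cross-traffic lane (at x = lane_x_m) a viewpoint
    setback_m behind the nose can see before the sightline grazing the
    obstruction corner at (corner_x_m, corner_y_m) cuts it off."""
    return corner_y_m * (lane_x_m + setback_m) / (corner_x_m + setback_m)

# Assumed scene: the nose has crept 1 m past the corner of an obstruction
# (parked truck, hedge) sitting 1.5 m to the side; the near cross-traffic
# lane is 4 m ahead of the nose.
scene = dict(lane_x_m=4.0, corner_x_m=-1.0, corner_y_m=1.5)

leaning_driver = visible_distance(setback_m=1.8, **scene)  # head ~0.5 m ahead of the pillar (assumed)
b_pillar_cam   = visible_distance(setback_m=2.3, **scene)  # pillar roughly mid-car (assumed)

print(f"leaning driver sees ~{leaning_driver:.0f} m up the lane")  # ~11 m
print(f"B-pillar camera sees ~{b_pillar_cam:.0f} m up the lane")   # ~7 m
```

The farther back the viewpoint sits, the less of the cross street it can see past the same obstruction, which is the whole point of the diagram Chuck Cook shows.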
 

Crissa

Well-known member
First Name
Crissa
Joined
Jul 8, 2020
Threads
127
Messages
16,727
Reaction score
27,819
Location
Santa Cruz
Vehicles
2014 Zero S, 2013 Mazda 3
Country flag
Yes, we can move our heads, but the AI has multiple camera angles we do not, and it doesn't have to ignore most of its field of view to use them.

It can also move, just like we can; it just moves the whole car to do so. Because of that, it can use a single camera to gauge the range of an object, where we rely on two.

-Crissa
 

JBee

Well-known member
First Name
JB
Joined
Nov 22, 2019
Threads
18
Messages
4,774
Reaction score
6,148
Location
Australia
Vehicles
Cybertruck
Occupation
. Professional Hobbyist
Country flag
FOV should be as good as possible; the B-pillars are too far back, as are even the windscreen cameras. Camera height is another issue in rough terrain: you can't see over a ridge off-road without a nose cam, no matter what. The windscreen camera will only see the sky, just like the driver.

And no matter what, the AI can't make up what it couldn't at least get a glimpse of first. We don't live in a simulation. 😎
 

Jhodgesatmb

Well-known member
First Name
Jack
Joined
Dec 1, 2019
Threads
68
Messages
5,158
Reaction score
7,403
Location
San Francisco Bay area
Website
www.arbor-studios.com
Vehicles
Tesla Model Y LR, Tesla Model 3 LR
Occupation
Retired AI researcher
Country flag
What concerns me is that the cameras can be blocked or destroyed, and that impacts what can be detected. I am constantly being told that one camera or another isn't able to detect the environment simply due to glare. That would not be acceptable for obstacle detection. I like the idea of dedicated sensors simply because they provide a permanently installed alternative to the cameras. Until there are so many cameras on the vehicle that losing one doesn't impact the ability to resolve close obstacles, I would be concerned about using the vision-only system for obstacle detection and avoidance.

At some point last year Tesla talked about a new interior sensor that would be used to detect a child's movements and might be used to detect nefarious activity nearby. Might it be possible that this type of sensor could augment the vision system to aid in obstacle detection?
 


Crissa

Well-known member
First Name
Crissa
Joined
Jul 8, 2020
Threads
127
Messages
16,727
Reaction score
27,819
Location
Santa Cruz
Vehicles
2014 Zero S, 2013 Mazda 3
Country flag
The ultrasonics can also be blocked or destroyed?

They're actually more fragile. Trust me, I just tended some musical devices that used them at Burning Man.

Rumor has it there is a new high-def radar that would go on the outside, yes. No idea when it'll actually be implemented.

-Crissa
 

charliemagpie

Well-known member
First Name
Charlie
Joined
Jul 6, 2021
Threads
42
Messages
2,909
Reaction score
5,177
Location
Australia
Vehicles
CybrBEAST
Occupation
retired
Country flag
If the car can't see accurately, it probably means the tail of nines doesn't pass 99.99% or 99.999%.

Might as well run this experiment now.

Or alternatively, let the eventual tens of millions of Teslas drive around playing Russian roulette.

It doesn't make sense to me that the car is winging it.
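To put a rough number on why those last nines matter at fleet scale (all figures below are made-up illustrations, not Tesla data):

```python
# Back-of-the-envelope sketch of the "tail of nines" at fleet scale.
# Every number here is an assumption for illustration, not Tesla data.

fleet_size = 10_000_000        # hypothetical cars on the road
tricky_events_per_day = 50     # hypothetical occlusion-type situations per car per day

for reliability in (0.9999, 0.99999):
    failures_per_day = fleet_size * tricky_events_per_day * (1 - reliability)
    print(f"{reliability * 100:g}% success rate -> ~{failures_per_day:,.0f} misjudged situations per day, fleet-wide")
```

Each extra nine cuts the fleet-wide failure count by a factor of ten, which is why "sees accurately almost all the time" isn't good enough.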
 

JBee

Well-known member
First Name
JB
Joined
Nov 22, 2019
Threads
18
Messages
4,774
Reaction score
6,148
Location
Australia
Vehicles
Cybertruck
Occupation
. Professional Hobbyist
Country flag
If the car can't see accurately, it probably means the tail of nines doesn't pass 99.99% or 99.999%.

Might as well run this experiment now.

Or alternatively, let the eventual tens of millions of Teslas drive around playing Russian roulette.

It doesn't make sense to me that the car is winging it.
Not winging it, just not fully developed. There are some angles it's not yet optimised for. It's not as if it was perfect when first conceived; ideas and technologies change, adapt and should be improved upon. Off-road is also not Tesla's standard habitat, and they will need to adjust for it.

I'm hopeful we get the Samsung ToF cameras so they don't have to "waste" CPU cycles building accurate 3D maps.
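For anyone wondering why a ToF camera would save that work: it measures depth directly at each pixel from the round-trip time of its own emitted light, so there is nothing to reconstruct from multiple views. A tiny sketch of the principle (generic, not any specific Samsung part):

```python
# Minimal sketch of the time-of-flight principle: depth comes straight from
# the round-trip travel time of emitted light. Illustrative only; real
# sensors typically measure a phase shift rather than timing a pulse directly.

C = 299_792_458.0  # speed of light, m/s

def tof_depth_m(round_trip_seconds):
    """Depth of the reflecting surface from the measured round-trip time."""
    return C * round_trip_seconds / 2.0

# A return delay of ~33 nanoseconds corresponds to an object about 5 m away.
print(f"{tof_depth_m(33e-9):.2f} m")
```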
 

Dids

Well-known member
First Name
Les
Joined
Dec 21, 2019
Threads
8
Messages
1,766
Reaction score
3,771
Location
Massachusetts
Vehicles
04 Tacoma, 23 Cybertruck
Occupation
Self
Country flag
Yes, we can move our heads, but the AI has multiple camera angles we do not, and it doesn't have to ignore most of its field of view to use them.

It can also move, just like we can; it just moves the whole car to do so. Because of that, it can use a single camera to gauge the range of an object, where we rely on two.

-Crissa
You point out an important fact. Humans use two spaced cameras for parallax; the car uses motion. Human parallax is degraded by motion of the observer, which is exactly what happens in a car, and once you add in motion of the target, humans become very bad at judging timing.
I suspect the reason the Model 3 and Y are the ones moving to vision-only first is that they have the smallest frontal occlusion.
I also know that the ultrasonic sensors lull people into a false sense of security and sometimes fail to alert, especially when backing up while turning, which leaves people with dings on the corners of the bumper.
I'm guessing Tesla has discovered that even in edge cases where an object is placed in front of a parked car, the camera is as accurate at alerting as the front ultrasonics are. Would the sonics even beep for a kid lying flat on the ground directly in front? Can someone with a kid and a sensor-equipped car please test this?
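To put rough numbers on the baseline point (assumed pinhole-camera values, purely illustrative):

```python
# Minimal pinhole-camera sketch comparing the parallax available from the
# ~6.5 cm between human eyes with the parallax a single camera gets once the
# car itself has moved. All values are assumptions for illustration; the car's
# motion is treated as an effective sideways baseline relative to the target
# (e.g. creeping forward while watching cross traffic off to the side).

def disparity_px(baseline_m, depth_m, focal_px=1000.0):
    """Pixel disparity of a point at depth_m seen from two viewpoints
    separated by baseline_m, for an assumed focal length in pixels."""
    return focal_px * baseline_m / depth_m

depth = 30.0  # metres to an approaching car (made-up example)

human_eyes = disparity_px(baseline_m=0.065, depth_m=depth)  # interocular distance
car_motion = disparity_px(baseline_m=1.0, depth_m=depth)    # car moved 1 m between views

print(f"human stereo at {depth:.0f} m:      ~{human_eyes:.1f} px of disparity")
print(f"one camera + 1 m of motion: ~{car_motion:.1f} px of disparity")
```

The bigger the baseline, the bigger the disparity and the better the range estimate, which is why a moving single camera can out-range a fixed pair of eyes.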
 

charliemagpie

Well-known member
First Name
Charlie
Joined
Jul 6, 2021
Threads
42
Messages
2,909
Reaction score
5,177
Location
Australia
Vehicles
CybrBEAST
Occupation
retired
Country flag
I wanted to avoid saying this because it seems too sci-fi, but I believe it won't be long.

Cars will monitor each other. It is a simple thing.

For all of these edge cases, cars can simply police themselves. If a car notices an object that may be an issue in front of a parked Tesla, it will be logged and relayed to the parked car.

Polling between cars would establish whether the parked car is 'asleep', and whether it may have been asleep when the object was placed there.

On 'awakening', the car will roll backwards for a good look; if it can't go back, it will ask a robotaxi to swing around and check its front.

If something is impeding progress, another car with a robot turns up and moves whatever it is out of the way. Or a human does it.


Seems far-fetched, but it will be normal by the time Tesla rolls out its 25 millionth car, around 2027.
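Just to make the idea concrete, here is a purely hypothetical sketch of the kind of report one car might relay to a parked one. None of this reflects any real Tesla interface; every name and field is invented for illustration.

```python
# Purely hypothetical sketch of a peer-relayed hazard report between cars.
# Nothing here corresponds to any real Tesla (or other) protocol; all names
# and fields are made up.

from dataclasses import dataclass
import time

@dataclass
class HazardReport:
    reporter_id: str   # car that spotted the object
    target_id: str     # parked car the report concerns
    description: str   # what was seen
    lat: float         # rough position of the object
    lon: float
    timestamp: float   # when it was observed

def relay(report: HazardReport, target_awake: bool) -> str:
    """Deliver the report now if the target car is awake, otherwise queue it
    so the car can check its surroundings before it pulls away."""
    if target_awake:
        return f"delivered to {report.target_id} immediately"
    return f"queued for {report.target_id}; will be read on wake"

report = HazardReport("car_A", "car_B", "object near front bumper",
                      -33.87, 151.21, time.time())
print(relay(report, target_awake=False))
```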


Btw, in Australia we have been progressively receiving more and more Teslas. I am seeing Teslas daily now; the other day at an intersection I could see three.

California could probably self-monitor to a high degree already.