Better safety through radar

JBee

Well-known member
First Name
JB
Joined
Nov 22, 2019
Threads
18
Messages
4,770
Reaction score
6,147
Location
Australia
Vehicles
Cybertruck
Occupation
. Professional Hobbyist
Country flag
I think there is a big problem with combining different types of sensors. I think most of us would agree camera vision is better at sensing the road environment (signs, stoplights, lane lines, etc.), and lidar/radar could exceed camera vision in adverse visual conditions (bad weather). But you have to remember that lidar and radar generate massive amounts of data, and that data is absolutely filled with noise. The noise has to be interpreted with something akin to the AI that currently interprets Tesla's camera data. It's a huge extra computing cost to add, and it makes it even more likely that the AI will misinterpret the data.

And what would be the real benefit of adding lidar/radar to the current camera vision? Maybe the camera doesn't detect a moose crossing the road in the distance through fog, but the radar does. In this situation the computer/AI needs to decide which sensor to trust. If there's really a moose there, we'd be happy for it to trust the radar and not the cameras. But if there isn't really a moose there, and the radar was picking up a bunch of leaves blowing across the road, you'd be pissed if the car slammed on the brakes and potentially caused an accident.

So it's not as simple as "more sensors leads to better capabilities." Rather, "more sensors leads to more noise, and more ambiguous conflicts for the AI".

It took me a while to get there, but I'm firmly in the visual-only camp.
In the moose-in-the-fog scenario, radar detecting it first would simply result in the vehicle slowing earlier, until vision confirms it. It's not an either/or situation: in fog or heavy rain, vision first needs to know whether any of its data is valid at all.

Rather, it's a case of vision being down-weighted depending on conditions, and likewise radar being down-weighted when vision is good.

The other item is the consequence of that data: an appropriate response can simply be to slow down until vision confirms, then brake, etc. Slowing down generally has few ramifications if done in a controlled fashion and using rear-vehicle distance information to do so. Don't forget vision and radar can track other vehicles' speed and trajectory as well, meaning a calculation can be run to determine the chance of a rear-ender as well as any side or forward impacts. This essentially puts you in a 360° safety bubble.

In general this "slowing down according to conditions" is a valid approach and something most sensible drivers do too. It also lets vision and radar take more responsive actions with a greater safety margin for all vehicles involved.
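To make that concrete, here's a toy sketch of the "down-weight by conditions, slow until vision confirms" idea. All names and thresholds are invented for illustration; this isn't anything Tesla has published:

```python
# Hypothetical planner sketch: weight radar-only detections by visibility,
# and respond by shedding speed rather than braking hard. Invented numbers.
def plan(radar_sees_obstacle: bool, vision_sees_obstacle: bool,
         visibility: float, rear_gap_s: float) -> str:
    """visibility: 0 (dense fog) .. 1 (clear day).
    rear_gap_s: time gap to the vehicle behind, from rear tracking."""
    if vision_sees_obstacle:
        return "brake"  # confirmed by vision: act decisively
    if radar_sees_obstacle and visibility < 0.5:
        # Vision can't confirm or deny, so shed speed in a controlled way,
        # more gently if someone is close behind (the 360-degree bubble).
        return "slow_gently" if rear_gap_s < 2.0 else "slow"
    if radar_sees_obstacle:
        # Good visibility and vision sees nothing: likely clutter (leaves),
        # so the radar return is down-weighted and we merely stay alert.
        return "monitor"
    return "cruise"

print(plan(True, False, visibility=0.2, rear_gap_s=3.0))  # fog: "slow"
print(plan(True, False, visibility=0.9, rear_gap_s=3.0))  # clear: "monitor"
```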

From a risk management perspective, it's important to implement the right controls to mitigate risk in all modes and situations, carefully filtering controls according to the full set of consequences, intended or not. Sometimes this means a control is implemented before approaching a high-risk consequence, and in this case that would still make radar a valid supplement to the system.

Whether they had to stop car production because of a lack of radar components is, however, a completely different set of variables, and how that was sold to customers is something else again.

 

ajdelange

Well-known member
First Name
A. J.
Joined
Dec 8, 2019
Threads
4
Messages
3,213
Reaction score
3,404
Location
Virginia/Quebec
Vehicles
Tesla X LR+, Lexus SUV, Toyota SR5, Toyota Landcruiser
Occupation
EE (Retired)
Country flag
So it's not as simple as "more sensors leads to better capabilities." Rather, "more sensors leads to more noise, and more ambiguous conflicts for the AI".
Whether or not adding a sensor improves measurement depends on the sensor's noise characteristics and how adding it changes the "DOP" (dilution of precision). If you try to measure the distance to the church by shooting the top of the steeple with a sextant, moving towards it a few feet, and repeating, you will get an answer, but not a very good one, because the accuracy of the sextant is probably poor relative to the amount the angle to the top of the steeple changes when you only move closer by a couple of feet. Now if you move in a lot, that's no longer true. The angle difference emerges from the "noise". Buying a more accurate sextant gives the same effect. Thus noise (K) and "geometry" (DOP) are both parts of the equation.

Mathematically, the quality of the estimates is inversely proportional to the noise multiplied by the DOP (these are both matrices, so this is a gross simplification). DOP is the inverse of a matrix whose size is increased by adding sensors, so DOP goes down when you do that. Adding a sensor can only make accuracy worse if the sensor's noise is so bad that its effect on K (the noise matrix) overwhelms the DOP matrix and the product goes up.
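For the curious, a minimal numerical sketch of that result (my own illustration, not from any Tesla material), using the standard weighted-least-squares covariance P = (HᵀK⁻¹H)⁻¹, where H is the geometry and K the noise covariance:

```python
# Weighted-least-squares estimate covariance: P = (H^T K^-1 H)^-1.
# H encodes the geometry (DOP), noise_vars the per-sensor noise (K).
import numpy as np

def estimate_covariance(H, noise_vars):
    K_inv = np.diag(1.0 / np.asarray(noise_vars))
    return np.linalg.inv(H.T @ K_inv @ H)

# Two sensors measuring a 2D state with poor geometry:
H2 = np.array([[1.0, 0.0],
               [0.9, 0.1]])
print(np.trace(estimate_covariance(H2, [1.0, 1.0])))       # ~182: poor

# Add a third sensor that improves the geometry: error shrinks dramatically.
H3 = np.vstack([H2, [0.0, 1.0]])
print(np.trace(estimate_covariance(H3, [1.0, 1.0, 1.0])))  # ~1.5

# The same sensor with huge noise contributes almost nothing: with optimal
# weighting its benefit vanishes as its variance grows.
print(np.trace(estimate_covariance(H3, [1.0, 1.0, 1e6])))  # back to ~182
```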

The argument made by Tesla was that the radar didn't decrease DOP enough to make it worthwhile to process its output. That's their design trade and you have to respect it. The problem is, of course, that if it is foggy or dark passive sensors can't see.
 

ajdelange

Well-known member
First Name
A. J.
Joined
Dec 8, 2019
Threads
4
Messages
3,213
Reaction score
3,404
Location
Virginia/Quebec
Vehicles
Tesla X LR+, Lexus SUV, Toyota SR5, Toyota Landcruiser
Occupation
EE (Retired)
Country flag
For those who know, in Beta FSD, does the car currently take full control in a fog?
No. How can it? If it can't see the white lines it is helpless and shuts down.

There is a stretch of interstate that runs south from Scranton through the hills, and that stretch gets incredibly heavy fog in the fall (and probably the rest of the year too). I've been through there twice: once before they disabled the radar and once after. I would not have even considered using the autopilot in either of those cases. The sensible thing to do would be to get off the road, but sitting hidden on the side of the road didn't seem like a good option either. Being able to follow the cars/trucks in front of me on the radar was a great comfort on the first trip. I really wish they would put it back, and it seems they have come to their senses and are planning to do so.
 

JBee

Well-known member
First Name
JB
Joined
Nov 22, 2019
Threads
18
Messages
4,770
Reaction score
6,147
Location
Australia
Vehicles
Cybertruck
Occupation
. Professional Hobbyist
Country flag
No. How can it? If it can't see the white lines it is helpless and shuts down.

There is a stretch of interstate that runs south from Scranton through the hills, and that stretch gets incredibly heavy fog in the fall (and probably the rest of the year too). I've been through there twice: once before they disabled the radar and once after. I would not have even considered using the autopilot in either of those cases. The sensible thing to do would be to get off the road, but sitting hidden on the side of the road didn't seem like a good option either. Being able to follow the cars/trucks in front of me on the radar was a great comfort on the first trip. I really wish they would put it back, and it seems they have come to their senses and are planning to do so.
Where did you hear they are bringing radar back? I'm interested because we have an MYP order inbound from China, which is vision-only now for Australia.
 

charliemagpie

Well-known member
First Name
Charlie
Joined
Jul 6, 2021
Threads
42
Messages
2,906
Reaction score
5,159
Location
Australia
Vehicles
CybrBEAST
Occupation
retired
Country flag
Elon said AI can see light photons. It would be able to see the line as clear as day.

Therein lies the answer. FSD beta shuts down.
 


ajdelange

Well-known member
First Name
A. J.
Joined
Dec 8, 2019
Threads
4
Messages
3,213
Reaction score
3,404
Location
Virginia/Quebec
Vehicles
Tesla X LR+, Lexus SUV, Toyota SR5, Toyota Landcruiser
Occupation
EE (Retired)
Country flag
Where did you hear they are bringing radar back? I'm interested because we have a MYP order inbound from China which is vision only now for Australia.
I'm trying to remember. Obviously it was some article on Teslarati or Electrek or some place like that, and it did not say "All Teslas will have radar starting with...". It was something like Tesla had applied to the FCC for a frequency authorization, but I don't even remember well enough to say that was it.
 

ajdelange

Well-known member
First Name
A. J.
Joined
Dec 8, 2019
Threads
4
Messages
3,213
Reaction score
3,404
Location
Virginia/Quebec
Vehicles
Tesla X LR+, Lexus SUV, Toyota SR5, Toyota Landcruiser
Occupation
EE (Retired)
Country flag
Elon said AI can see light photons.
Uh, well, yes. That's how your eyes and cameras work. Don't know what AI has to do with it.

It would be able to see the line as clear as day.
If you can't see the lines, a camera probably can't either, unless it is sensitive to wavelengths appreciably bigger than the fog droplets. Those cameras get expensive.
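For a rough sense of scale (my own back-of-envelope numbers, assuming ~10 µm fog droplets): scattering strength depends on the size parameter x = 2πr/λ, and it falls off sharply once the wavelength is much bigger than the droplet.

```python
# Size parameter x = 2*pi*r / wavelength for an assumed ~10 um fog droplet.
# x >> 1: strong scattering (fog is opaque); x << 1: nearly transparent.
import math

r = 10e-6  # droplet radius in metres (assumed)
for name, lam in [("visible (550 nm)", 550e-9),
                  ("LWIR thermal (10 um)", 10e-6),
                  ("77 GHz radar (3.9 mm)", 3.9e-3)]:
    print(f"{name:22s} x = {2 * math.pi * r / lam:8.3f}")
# visible ~114 (opaque), LWIR ~6 (better), radar ~0.016 (sees through fog)
```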
 

JBee

Well-known member
First Name
JB
Joined
Nov 22, 2019
Threads
18
Messages
4,770
Reaction score
6,147
Location
Australia
Vehicles
Cybertruck
Occupation
. Professional Hobbyist
Country flag
I'm still hoping Tesla uses the Samsung ToF cameras in their deal. That would at least do ranging, but only with visible light atm, I think.
 

charliemagpie

Well-known member
First Name
Charlie
Joined
Jul 6, 2021
Threads
42
Messages
2,906
Reaction score
5,159
Location
Australia
Vehicles
CybrBEAST
Occupation
retired
Country flag
Uh, well, yes. That's how your eyes and cameras work. Don't know what AI has to do with it.

If you can't see the lines, a camera probably can't either, unless it is sensitive to wavelengths appreciably bigger than the fog droplets. Those cameras get expensive.
 

swengl

Well-known member
First Name
Steve
Joined
Sep 9, 2021
Threads
14
Messages
494
Reaction score
929
Location
United States
Vehicles
Model S, Model Y
Country flag
I have a 2015 S85D with AP1 (radar), and about the only time AP/TACC doesn't work is when snow builds up in the grille and blocks the sensor, or in extremely heavy rain/snow. The good news: the car tells you that it can't use the radar and prevents you from activating AP/TACC.
 


ajdelange

Well-known member
First Name
A. J.
Joined
Dec 8, 2019
Threads
4
Messages
3,213
Reaction score
3,404
Location
Virginia/Quebec
Vehicles
Tesla X LR+, Lexus SUV, Toyota SR5, Toyota Landcruiser
Occupation
EE (Retired)
Country flag
Your Tesla will see in the dark!
X-ray vision!

I'm afraid you are the victim of showmanship, technical naivete and Elon's faith in what AI can do.

Photons knock electrons in photosensors (eyes or cameras) up into the conduction band, where they can be collected. Heat does this too. If you go into a cave and turn off the lights you will see point flashes of light. Some of these are from individual photons striking a rod or cone. Some are from electrons kicked up by the 300 K temperature of your body. So you can see individual photons!

In a camera the situation is much the same. But with a camera there is a difference. You can integrate the accumulated charge for only a brief period (high DIN/ASA setting) or for a longer time (low DIN/ASA). With the longer time you pick up more charge for a given light level, and the signal grows proportionally, as does the noise (thermally induced electrons), but the variability in the noise shrinks relative to the signal as the interval grows, and the signal-to-noise ratio is enhanced thereby. Besides this you can cool the sensor (astronomers do this with liquid nitrogen or helium) and greatly reduce the number of thermal electrons, enhancing the SNR (signal-to-noise ratio) dramatically. In your eye the effective exposure time is what it is; you don't have any control over it. Scotopic (rod) vision is more sensitive than photopic (cone) vision. Given this I have no trouble accepting that a camera can be more sensitive to light, but both "count" photons in the sense that the voltage sent onward is the sum of the number of photons that impinged during the effective time constant of the system.
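A toy calculation of that SNR effect (my own numbers, not real camera specs): the signal grows as t while the noise grows as √t, so SNR improves as √t, and cooling shrinks the thermal term.

```python
# Photon and thermal electrons both arrive ~Poisson: mean = rate * t,
# standard deviation = sqrt(mean). So SNR = signal / noise grows as sqrt(t).
import math

photon_rate = 50.0    # photo-electrons per second (assumed)
thermal_rate = 200.0  # thermal electrons per second (assumed; cooling cuts this)

for t in (0.01, 0.1, 1.0):
    signal = photon_rate * t
    noise = math.sqrt((photon_rate + thermal_rate) * t)
    print(f"t={t:5.2f}s  signal={signal:6.2f}e-  noise={noise:5.2f}e-  "
          f"SNR={signal / noise:5.2f}")
# Each 10x longer exposure buys roughly a sqrt(10) ~ 3.2x better SNR.
```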

The comments about pretty pictures are fluff. Because we see best in the green, half the pixels in a camera are green-sensitive, and of the remainder half are blue-sensitive and half are red-sensitive (a quarter of the total each). The first bit of processing in a sensor is "de-Bayering", which interpolates (or otherwise processes) the sparse R and B pixels to produce R, G, and B values at each pixel location. Next comes tone and color correction, in which the sensor's native color space is translated to a standard color space and things like gamma correction and boosting of saturation to bright, cheery "Kodacolor" levels take place, and finally some sort of sharpening (2D differentiation) is done before the final transformation to an output space. All this, as they note, takes quite a few processing cycles and thus time (introducing latency). But there is NO REASON any of this processing needs to be done in a system that is not trying to replicate Kodacolor prints, so none of that relates at all to the topic at hand. All the system needs to do is read the raw data from the camera and pass it on.
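To illustrate that last point (a generic RGGB pattern assumed here; nothing Tesla-specific): a network can take the raw mosaic directly, e.g. split into four quarter-resolution planes, skipping de-Bayering, gamma, and sharpening entirely.

```python
# Split a raw Bayer mosaic (RGGB assumed) into four quarter-resolution
# planes: a 4-channel tensor a network can consume with no ISP at all.
import numpy as np

raw = np.random.default_rng(0).integers(0, 4096, (8, 8))  # toy 12-bit frame
planes = np.stack([raw[0::2, 0::2],   # R
                   raw[0::2, 1::2],   # G1
                   raw[1::2, 0::2],   # G2
                   raw[1::2, 1::2]])  # B
print(planes.shape)  # (4, 4, 4)
```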

A helium-cooled CCD in a 30-minute exposure has always been able to see stars I can't see, so nothing new here. The only thing that is potentially different is the claim that with AI they can process a temporal sequence of images to locate objects in 4D. Like most AI this is, to quote Elon's comments made at the beginning of each year, "going to be available by the end of this year".
 

rr6013

Well-known member
First Name
Rex
Joined
Apr 22, 2020
Threads
54
Messages
1,680
Reaction score
1,620
Location
Coronado Bay Panama
Website
shorttakes.substack.com
Vehicles
1997 Tahoe 2 door 4x4
Occupation
Retired software developer and heavy commercial design builder
Country flag
Safety is what I want most from my Cybertruck. (I want all the other great features too.) Therefore I want it to see what I can't see with my eyes, to keep me safe when human vision can't detect what is happening. The Cybertruck should see in/through the fog, through the spray of water that comes up from a semi or other vehicles on the rainy, crappy roads and freeways of the Pacific Northwest. I want it to see through the snow flying up from behind another vehicle, so common in winter in cold climates like Montana. To do this it needs some kind of active sensing like radar or lidar. If FSD only works in fair weather, I can live with that, but I want my amazing machine to do what I can't do. To rely only on regular cameras is limiting. Unless Tesla can tell me otherwise, that their cameras can see through fog etc., I want it to have radar or something to see what human vision can't.
NHTSA is knocking. It has some facts that Tesla PUREvision can't answer yet.

The OP is right about safety. Is it too much to ask, nay demand? In the case of a semi-trailer pulled across lanes of travel? In the case of emergency vehicles? In the case of loss of lane demarcation? Road construction? Divided-freeway abutments?

PUREvision appears 93%+ viable for general highways at legal speeds. The remaining 7% has Teslas killing people. MILLIONS of miles of data can't fix it, yet. Dojo isn't real in the deployment solution space.

Radar appears to be Tesla's "good faith" actionable evidence to NHTSA that it's fixing a liability it created.
 

Crissa

Well-known member
First Name
Crissa
Joined
Jul 8, 2020
Threads
127
Messages
16,592
Reaction score
27,644
Location
Santa Cruz
Vehicles
2014 Zero S, 2013 Mazda 3
Country flag
OP is right about Safety. Is it too much to ask, neh demand? In the case of semi-trailer pulled across lanes of travel? In case of emergency vehicles? In case of loss of lane demarcation? Road construction? Divided freeway abutments?
All of those collisions were with radar-enabled Teslas.

The vision-only vehicles were added a year and some ago and aren't involved in that NTSB investigation of old collisions.

-Crissa
 

rr6013

Well-known member
First Name
Rex
Joined
Apr 22, 2020
Threads
54
Messages
1,680
Reaction score
1,620
Location
Coronado Bay Panama
Website
shorttakes.substack.com
Vehicles
1997 Tahoe 2 door 4x4
Occupation
Retired software developer and heavy commercial design builder
Country flag
All of those collisions were with radar-enabled Teslas.

The vision-only vehicles were added a year and some ago and aren't involved in that NTSB investigation of old collisions.

-Crissa
Why now? Tesla tossed RADAR.

It's additive layering, so what need does PUREvision have that RADAR can fill?
 

CyberGus

Well-known member
First Name
Gus
Joined
May 22, 2021
Threads
69
Messages
5,991
Reaction score
19,650
Location
Austin, TX
Website
www.timeanddate.com
Vehicles
1981 DeLorean, 2024 Cybertruck
Occupation
IT Specialist
Country flag
Humans suck at probabilities and estimating risk. They buy lotto tickets hoping to win, but leave seatbelts unfastened because "what are the odds?"

Driving entails many risky choices. You can make an unprotected left at a blind corner and get away with it, sometimes. Autonomous driving eliminates mistakes due to inattention or recklessness, but the risks are still non-zero.

I wonder how much risk people will be willing to accept when they are no longer in control?

 
 



