Anyone else concerned about fog without radar?

tidmutt

Well-known member
First Name
Daniel
Joined
Feb 25, 2020
Threads
8
Messages
603
Reaction score
992
Location
Somewhere hot and humid
Vehicles
Model Y Performance, Model X P100D
Occupation
Software Architect
Country flag
Yes, using the type of video Tesla uses, you can find the exact location of a single point of light.

First, it uses parallax between the different cameras - three forward ones in this case - to determine the light's position in space. (This is how we use still images to know where stars are.)
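A minimal sketch of that parallax idea, with made-up numbers rather than Tesla's actual camera geometry:

```python
# Rough sketch of range-from-parallax: two cameras a known distance apart see the
# same point light at slightly different image positions, and that disparity gives
# depth. Baseline and focal length here are illustrative guesses, not Tesla's.

BASELINE_M = 0.15          # assumed spacing between two forward cameras, in metres
FOCAL_LENGTH_PX = 1400.0   # assumed focal length expressed in pixels

def range_from_disparity(x_left_px: float, x_right_px: float) -> float:
    """Classic stereo relation: depth = focal_length * baseline / disparity."""
    disparity = x_left_px - x_right_px
    if disparity <= 0:
        raise ValueError("point must be in front of both cameras")
    return FOCAL_LENGTH_PX * BASELINE_M / disparity

# A light seen 3 pixels apart between the two cameras would be about 70 m away.
print(range_from_disparity(640.0, 637.0))  # 70.0
```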

Second, it detects edges in the image, so it knows how many pixels any light it sees takes up. Then it compares this to a previous image. This also tells it positional information, as things changing size are usually also moving towards or away from you.
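And a minimal sketch of the size-change idea, again with illustrative numbers only:

```python
# Rough sketch of the "growing image means approaching object" point: the rate at
# which a light's image expands gives a time-to-contact estimate even without
# knowing the object's true size. Frame rate and pixel sizes are made up.

FRAME_INTERVAL_S = 1.0 / 36.0   # assumed time between camera frames

def time_to_contact(size_prev_px: float, size_curr_px: float) -> float:
    """TTC ~= current image size / rate of growth of the image size."""
    growth_px_per_s = (size_curr_px - size_prev_px) / FRAME_INTERVAL_S
    return size_curr_px / growth_px_per_s

# A tail light growing from 20.0 to 20.25 pixels in one frame: about 2.25 s away.
print(time_to_contact(20.0, 20.25))
```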

Third, whenever it detects a light, it gives it a color value and then begins acting on it, just like a human, before it even knows where the light is.


But this is why it's got us backing it up right now: it's just a newbie driver, subject to all the foibles that vision has. Two heads are better than one. Eventually, it'll be taught all these things. Who knew you'd have to teach a car some basic astronomy? Well...

-Crissa
Honestly, I find that kind of awesome.

 

ajdelange

Well-known member
First Name
A. J.
Joined
Dec 8, 2019
Threads
4
Messages
3,213
Reaction score
3,403
Location
Virginia/Quebec
Vehicles
Tesla X LR+, Lexus SUV, Toyota SR5, Toyota Landcruiser
Occupation
EE (Retired)
Country flag
I understand the geometry constraints on range estimation accuracy just fine. You have a telescope that... It's actually much more involved than you have discussed.
That's great because I can then use the language of engineering to explain what it is I am trying to get across.

With my telephoto lens and DSLR, knowing the diameter of Saturn, I can estimate its distance from Earth. It looks like a point source of light to the naked eye, but the theoretical limits of range estimation accuracy are dictated only by the focal length of the lens and the density of pixels. As objects become more distant and smaller, the physical (diffraction) limits of light come into play. But that is not a concern relative to autonomy.
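To make that concrete, here is a back-of-the-envelope version of the range-from-angular-size calculation; the lens, pixel and image-size numbers are assumptions for illustration, not the actual setup:

```python
# Rough numbers for the Saturn illustration above: with a known true diameter, the
# apparent size on the sensor gives the range. Lens, pixel pitch and measured disc
# width are assumed values.

SATURN_DIAMETER_KM = 120_536    # Saturn's equatorial diameter
FOCAL_LENGTH_MM = 600.0         # assumed telephoto focal length
PIXEL_PITCH_MM = 0.004          # assumed 4-micron sensor pixels
DISC_WIDTH_PX = 13.0            # assumed width of Saturn's disc in the photo

# Small-angle approximation: angular size (radians) = image size / focal length
angular_size_rad = DISC_WIDTH_PX * PIXEL_PITCH_MM / FOCAL_LENGTH_MM

# Range = true size / angular size
range_km = SATURN_DIAMETER_KM / angular_size_rad
print(f"estimated distance to Saturn: {range_km:,.0f} km")  # roughly 1.4 billion km
```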
And I think this is a great (and fun) example to look at.
[Attached image: the two estimation equations discussed below]


As you are familiar with the language you will recognize these equations, but since different texts use different symbols and you may be a little rusty: the first equation says that we have a sensor suite that produces, over time, a set of measurements which are the elements of the vector r. That vector is a function of a set of parameters we need to estimate, x, and a set of parameters which we cannot, do not wish to, or do not need to estimate, p. As the world is not perfect, the sensors' measurements are corrupted by noise, represented by the vector n.

We have a system that processes r and comes up with an estimate for x considering p. I'm sure you recall that those are called the "consider parameters". In the Saturn example, p contains the diameter of Saturn that we need to convert the image size to a range estimate. We are interested in the quality of our estimate as represented by the elements of the covariance matrix Kx. It's clear from the formula that this depends on the quality of the sensor measurements (Kn), the uncertainty in the consider parameters (Kp), and a couple of matrices which describe the sensitivity of the measurements to small changes in, respectively, x (the A matrix) and p (the B matrix). They are just the partials of r with respect to x and p.
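For readers who cannot see the attached image, the two equations being described above would, in standard notation, look something like this (a reconstruction assuming the usual linearized consider-covariance form, not a copy of the attachment):

\[
\mathbf{r} = f(\mathbf{x}, \mathbf{p}) + \mathbf{n}
\]
\[
K_x = \left(A^{\mathsf T} K_n^{-1} A\right)^{-1} + S\,B\,K_p\,B^{\mathsf T} S^{\mathsf T},
\qquad S = \left(A^{\mathsf T} K_n^{-1} A\right)^{-1} A^{\mathsf T} K_n^{-1},
\qquad A = \frac{\partial \mathbf{r}}{\partial \mathbf{x}},\;
B = \frac{\partial \mathbf{r}}{\partial \mathbf{p}}.
\]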

Given that you understand the above, there is no need for me to lecture you further, and no need for me to respond to your other comments in this post, because if you really understand that second equation you will realize that all the things you mention, such as surround boxes, measurement over time, perspective, vanishing points, and changes of size over time, are all just measurements that plug right into the equation.

I am a little suspicious because, if you really understood, you should have picked that up. Also, some of your comments on AI and optics are a little naive. For example, if AI doesn't compute any answers, how does it synthesize commands to send to the car's servos? The chips do tens of TOPS each. If those TOPS aren't computing anything, what are they doing?

Finally, I'll comment that I went looking for the other forward-looking cameras on my X. I couldn't find any!
 

HaulingAss

Well-known member
First Name
Mike
Joined
Oct 3, 2020
Threads
9
Messages
4,486
Reaction score
9,454
Location
Washington State
Vehicles
2010 F-150, 2018 Model 3 Perform, FS Cybertruck
Country flag
I am a little suspicious because, if you really understood, you should have picked that up. Also, some of your comments on AI and optics are a little naive. For example, if AI doesn't compute any answers, how does it synthesize commands to send to the car's servos? The chips do tens of TOPS each. If those TOPS aren't computing anything, what are they doing?
There you go again being arrogant while simultaneously demonstrating you don't understand how Tesla's FSD works. This isn't really the forum to bring you up to speed on all the foundational AI you will need to grasp the basics. I don't really have time right now to compile a bunch of sources because you're going to need a lot for a good foundation. I've been studying it pretty heavily for over four years and I'm still far from an expert, but I assume you are familiar with seeking out knowledge on new topics on your own? It'll be worth it.
 

Crissa

Well-known member
First Name
Crissa
Joined
Jul 8, 2020
Threads
126
Messages
16,211
Reaction score
27,072
Location
Santa Cruz
Vehicles
2014 Zero S, 2013 Mazda 3
Country flag
The math is fun and all, but I learned about twenty years ago that it's not appropriate to bring up unless you define all variables in situ and even then, including error bars is too complex and clouds the message you're trying to send. It's just best to handwave those functions and say they exist.

And the math is largely unimportant to the basic order of operations. For instance, one of the reasons the car slows down is that it chooses to act upon yellow lights before it has given them a position in space. That's really the safe thing to do, especially while you have a backup driver to tell you if you've gotten it wrong.

-Crissa
 

webroady

Member
First Name
Mike
Joined
Jul 8, 2021
Threads
1
Messages
21
Reaction score
31
Location
Indiana
Vehicles
2008 Toyota Scion xB, 2009 F250
Occupation
retired
Country flag
I'm sure the cameras outperform people, but I'm wondering how they could possibly outperform radar.
Radar has no hope of seeing lane markings on the road, recognizing traffic lights and knowing what color they are or if they're blinking, recognizing traffic sign content (construction site flagger holding slow/stop sign, detour, lane metering), emergency vehicle lights, etc. So it's obvious to me that vision is necessary.

I am a bit uneasy about the lack of radar in fog, but the truck will obviously have headlights that could perhaps be adapted for fog, and cameras can definitely see better than humans. I hope Tesla does a lot of testing in your foggy conditions.
 


HaulingAss

Well-known member
First Name
Mike
Joined
Oct 3, 2020
Threads
9
Messages
4,486
Reaction score
9,454
Location
Washington State
Vehicles
2010 F-150, 2018 Model 3 Perform, FS Cybertruck
Country flag
Radar has no hope of seeing lane markings on the road, recognizing traffic lights and knowing what color they are or if they're blinking, recognizing traffic sign content (construction site flagger holding slow/stop sign, detour, lane metering), emergency vehicle lights, etc. So it's obvious to me that vision is necessary.

I am a bit uneasy about the lack of radar in fog, but the truck will obviously have headlights that could perhaps be adapted for fog, and cameras can definitely see better than humans. I hope Tesla does a lot of testing in your foggy conditions.
People don't use radar to drive in fog, and cameras alone will allow autonomous fog driving on par with humans or better. Radar would not allow faster speeds anyway because, as you have pointed out, radar cannot see traffic lights or lane markings, or read signs. Plus, it would be unsafe for an autonomous car to be barreling along at 45 or 50 mph when everyone else is going 25 mph.
 

ajdelange

Well-known member
First Name
A. J.
Joined
Dec 8, 2019
Threads
4
Messages
3,213
Reaction score
3,403
Location
Virginia/Quebec
Vehicles
Tesla X LR+, Lexus SUV, Toyota SR5, Toyota Landcruiser
Occupation
EE (Retired)
Country flag
There you go again being arrogant while simultaneously demonstrating you don't understand how Tesla's FSD works.
Of course I don't know how their system works. Where would I get that information?


This isn't really the forum to bring you up to speed on all the foundational AI you will need to grasp the basics. I don't really have time right now to compile a bunch of sources because you're going to need a lot for a good foundation. I've been studying it pretty heavily for over four years and I'm still far from an expert, but I assume you are familiar with seeking out knowledge on new topics on your own? It'll be worth it.
I have dabbled in AI from time to time. I just remembered that another guy and I tried to "build" (in software) a perceptron that predicted the weather, I think while at Cornell (where, I just found out, the perceptron was invented). I used to have an AI section in my department. I have fiddled with fuzzy control a bit, only to find out later that it is considered a branch of AI. But I'm certainly no expert.

I applaud your efforts to educate yourself on the subject but I really think you need to focus less on AI itself for the moment and more on where it fits into the broader problem of adaptive control of systems. The "good foundation" you refer to is achieved when you thoroughly understand the systems engineering problem. Then, and only then, will you have the perspective to appreciate when and where an AI solution may aid in solving the system problem. And when it may not.

I'm tempted to suggest starting at the beginning with Wiener's "Cybernetics". This should be of particular interest to everyone here as Wiener coined the term "Cybernetics" after which our truck was named. But I must caution that the book gets a little heavy into the math. You might want to start with the Cybernetics article on Wikipedia.

I'll also point out that I didn't learn about fuzzy logic by reading a couple of books on it. I learned about it by building fuzzy controllers in software (using what I got from the books).
 

ajdelange

Well-known member
First Name
A. J.
Joined
Dec 8, 2019
Threads
4
Messages
3,213
Reaction score
3,403
Location
Virginia/Quebec
Vehicles
Tesla X LR+, Lexus SUV, Toyota SR5, Toyota Landcruiser
Occupation
EE (Retired)
Country flag
Radar has no hope of seeing lane markings on the road, recognizing traffic lights and knowing what color they are or if they're blinking, recognizing traffic sign content (construction site flagger holding slow/stop sign, detour, lane metering), emergency vehicle lights, etc. So it's obvious to me that vision is necessary.
Yes, it is. I don't think anyone has suggested getting rid of vision and relying solely on radar and sonar. It's the converse. The Tesla autopilot relies on being able to see those lane demarcations and that is done with the cameras. No lane demarcation? Autopilot shuts off.

Now if the FHWA decides that the "white lines" have to be laid down with radar-reflective paint, things might be different.

The main issue in this thread is that folks are appreciating that, in general, the more sensors you have, the better the estimate of the state the vehicle needs in order to avoid running into things. But adding another sensor doesn't necessarily bring enough improvement to justify the cost or effort of adding it. That's what Tesla is saying. The radar does not bring enough to the table to warrant fixing it.
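A toy illustration of that diminishing-returns point, using the textbook rule that independent measurements fuse by inverse-variance weighting; the error figures below are made up, not Tesla's:

```python
# Toy illustration of "another sensor doesn't necessarily buy much": independent
# measurements of the same quantity combine by inverse-variance weighting, so a
# much noisier second sensor barely tightens the estimate. Sigmas are made up.

def fused_std(*sigmas: float) -> float:
    """Standard deviation after optimally fusing independent measurements."""
    return sum(1.0 / s**2 for s in sigmas) ** -0.5

vision_sigma_m = 1.0   # assumed range error from vision alone
radar_sigma_m = 5.0    # assumed range error from a poorly behaved radar

print(fused_std(vision_sigma_m))                 # 1.00 m with vision only
print(fused_std(vision_sigma_m, radar_sigma_m))  # ~0.98 m with radar added
```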
 

HaulingAss

Well-known member
First Name
Mike
Joined
Oct 3, 2020
Threads
9
Messages
4,486
Reaction score
9,454
Location
Washington State
Vehicles
2010 F-150, 2018 Model 3 Perform, FS Cybertruck
Country flag
The main issue in this thread is that folks are appreciating that, in general, the more sensors you have, the better the estimate of the state the vehicle needs in order to avoid running into things. But adding another sensor doesn't necessarily bring enough improvement to justify the cost or effort of adding it. That's what Tesla is saying. The radar does not bring enough to the table to warrant fixing it.
No, that's not what Tesla is saying. They have told us that vision only will perform at a higher level than vision plus radar. The fundamental problem is how to handle the instances where radar and vision disagree. Looking at the problem with basic logic, they found that the resources needed to resolve radar/vision disagreements would be better spent processing the visual data more completely (as humans do to drive). Humans don't use radar to drive.
 

Crissa

Well-known member
First Name
Crissa
Joined
Jul 8, 2020
Threads
126
Messages
16,211
Reaction score
27,072
Location
Santa Cruz
Vehicles
2014 Zero S, 2013 Mazda 3
Country flag
The problem of radar-vision disagreement is that the two systems always are looking for the same objects.

It's a set problem. Vision correctly detects [A, B, C, E, F] and is wrong in case [G]. Radar correctly detects [B, D, F] and incorrectly reports [E, G].

When a report of D, E, or G comes in, you can't tell from the two systems alone which ones are real.

Which system do you choose?

Elon basically said that the edge cases where radar was right were swamped, by an order of magnitude, by the situations in which it was wrong. They were ignoring it so often that even when it was right, they had to ignore it, because vision had already come up with a solution.

-Crissa
 


ajdelange

Well-known member
First Name
A. J.
Joined
Dec 8, 2019
Threads
4
Messages
3,213
Reaction score
3,403
Location
Virginia/Quebec
Vehicles
Tesla X LR+, Lexus SUV, Toyota SR5, Toyota Landcruiser
Occupation
EE (Retired)
Country flag
No, that's not what Tesla is saying.
Evidently you missed it, but a presentation by Tesla's Dr. Karpathy was posted in No. 23. You should watch that and pay particular attention around 26:55 et seq.

They have told us that vision only will perform at a higher level than vision plus radar.
No. Actually what he says is that they didn't handle the fusion right and it's not worth fixing this, as vision by itself is good enough. He would never say vision by itself is better than vision plus radar, because that is untrue unless the radar is faulty, and Tesla's is. Can it be fixed? He says it can. Is it worth doing? He says it is not.

What we hear is influenced by our previous experiences and knowledge. What I hear from Dr. Karpathy is, evidently, quite different from what you or my wife would hear.

Humans don't use radar to drive.
Yes, yes they do, and they use sonar and GPS too. Were they able to drive in the past without those things? Yes, of course they were, so you might question whether they are really needed. But the goal of FSD is not only to save labour but to make transport safer. Vision certainly has a role to play in this, but so do sonar, radar, lidar, GPS, Glonass, electronic guideways...
 

HaulingAss

Well-known member
First Name
Mike
Joined
Oct 3, 2020
Threads
9
Messages
4,486
Reaction score
9,454
Location
Washington State
Vehicles
2010 F-150, 2018 Model 3 Perform, FS Cybertruck
Country flag
Evidently you missed it, but a presentation by Tesla's Dr. Karpathy was posted in No. 23. You should watch that and pay particular attention around 26:55 et seq.
Yes, I already saw that; I just interpret it differently, in that there is a time penalty in getting radar-vision fusion to the point where it performs better than vision alone. There is also a resource penalty in terms of the hardware already on cars. So vision will outperform vision plus radar for an unknowable amount of time. Potentially indefinitely. That's essentially what Karpathy said.

No. Actually what he says is that they didn't handle the fusion right and it's not worth fixing this, as vision by itself is good enough. He would never say vision by itself is better than vision plus radar, because that is untrue unless the radar is faulty, and Tesla's is. Can it be fixed? He says it can. Is it worth doing? He says it is not.
He also doesn't say vision plus radar will outperform vision only. All he says is that they can fix the current conflicts, given enough time. That does not imply it will outperform a vision-only system that has also been upgraded to take advantage of the resources previously consumed by radar.

Sometimes the devil is in the details!

What we hear is influenced by our previous experiences and knowledge. What I hear from Dr. Karpathy is, evidently, quite different from what you or my wife would hear.
Obviously I heard it differently than you did, because you failed to consider the system resources consumed by the fusion of radar and vision, probably because you have little familiarity with how artificial intelligence and machine vision actually work. By applying those resources to better vision processing, vision alone is likely to outperform an optimized combination of radar and vision on the same hardware. That's what Karpathy has actually said.

Yes, yes they do, and they use sonar and GPS too. Were they able to drive in the past without those things? Yes, of course they were, so you might question whether they are really needed. But the goal of FSD is not only to save labour but to make transport safer. Vision certainly has a role to play in this, but so do sonar, radar, lidar, GPS, Glonass, electronic guideways...
Karpathy has decided radar does not have a viable place in FSD. At least not the way he's planning to implement it. Were you not listening? And, yes, he's planning to make it safer than human driving is.
 
Last edited:

ajdelange

Well-known member
First Name
A. J.
Joined
Dec 8, 2019
Threads
4
Messages
3,213
Reaction score
3,403
Location
Virginia/Quebec
Vehicles
Tesla X LR+, Lexus SUV, Toyota SR5, Toyota Landcruiser
Occupation
EE (Retired)
Country flag
I think it is time for us to mutually realize that I can no more set aside the perspective gained in practicing systems engineering for 50 yrs than you can gain it in 4 years of reading books. This means that we will never converge. Keep studying and you may one day gain enough to understand what is going on here - or understand it better than you do now, at any rate.
 

Crissa

Well-known member
First Name
Crissa
Joined
Jul 8, 2020
Threads
126
Messages
16,211
Reaction score
27,072
Location
Santa Cruz
Vehicles
2014 Zero S, 2013 Mazda 3
Country flag
I think it is time for us to mutually realize that I can no more set aside the perspective gained in practicing systems engineering for 50 yrs than you can gain it...
Dude, nothing you posted in this thread is engineering. It's not programming, either.

And telling people they can't understand your brilliance is uncool.

-Crissa