FSD Using Cameras in Rain??

flowerlandfilms

Well-known member
First Name
Eryk
Joined
Dec 6, 2020
Threads
3
Messages
372
Reaction score
721
Location
Australia
Vehicles
Yamaha SRV-250
Occupation
Film Maker
Anyone who does astrophotography knows that cameras have filters in them to block infrared and ultraviolet light: little pieces of plastic/glass in every camera so holiday snaps more accurately represent what humans see. But the CCD itself is perfectly capable of seeing that extra part of the spectrum.

Tesla could easily delete that filter and get more information into the camera. I'm certainly not enough of an expert to know whether that extra information would be noise or useful data for their system, but deleting parts to gain functionality is very much their ethos.

 

Crissa

Well-known member
First Name
Crissa
Joined
Jul 8, 2020
Threads
96
Messages
11,271
Reaction score
18,478
Location
Santa Cruz
Vehicles
2014 Zero S, 2013 Mazda 3
Anyone who does astrophotography knows that cameras have filters in them to block infrared and ultraviolet light: little pieces of plastic/glass in every camera so holiday snaps more accurately represent what humans see. But the CCD itself is perfectly capable of seeing that extra part of the spectrum.

Tesla could easily delete that filter and get more information into the camera. I'm certainly not enough of an expert to know whether that extra information would be noise or useful data for their system, but deleting parts to gain functionality is very much their ethos.
Yes, Tesla does that. Elon has referred to it before. Your description is better than his, of course.



-Crissa
 
OP
damnitjim

Well-known member
First Name
Bones
Joined
Oct 21, 2020
Threads
6
Messages
138
Reaction score
230
Location
Massachusetts
Vehicles
Audi Q7 S-Line, Audi Q5 PHEV S-Line, CT3 Pre-Order
Occupation
AV Field Engineer
😊 It is also important to add only slightly more advanced technology, so as to sustain an appetite for future demand. Otherwise, a company may put itself out of business. Planned obsolescence is the key to any business.
Sounds like you are a fellow iPhone user... lol
 
OP
damnitjim

Well-known member
First Name
Bones
Joined
Oct 21, 2020
Threads
6
Messages
138
Reaction score
230
Location
Massachusetts
Vehicles
Audi Q7 S-Line, Audi Q5 PHEV S-Line, CT3 Pre-Order
Occupation
AV Field Engineer
Yes, Tesla does that. Elon has referred to it before. Your description is better than his, of course.



-Crissa
While I understand what Elon is saying, I think of it like hallucinating: you see things until you reach out to touch them and nothing is there. (Not that I have any experience with that, hahaha)
 


LoneWolfO6

Well-known member
First Name
T
Joined
May 27, 2020
Threads
5
Messages
210
Reaction score
176
Location
America
Vehicles
2018 Model 3
Occupation
Retired Military
I drive FSD Beta every day in my Model Y.

As others have said, the Beta just mentions that its performance may be degraded. That's the worst I've seen from the Beta during heavy rain. Navigate on Autopilot just disables itself but keeps lane centering and follow distance.

I have had Autopilot hand control over to me (I was on city streets, so not really Autopilot's domain). It wasn't raining; something just caused it to say "Nope!" and go nuts, flashing warnings, etc. It's very rare, though. I don't recall the Beta ever doing that.

I like Tesla's iterative approach. If they can make FSD work better than a human in perfect weather, and about as well as (or slightly better than) a human in poor weather, that's a huge win. Driving superhumanly in poor weather may take additional sensors or maybe better cameras... not sure, but I can see why that's not a priority right now.
Yep, I was driving, and whichever camera is facing the sun throws an error that visibility is limited and Autopilot cannot activate?! Examples…

[Two screenshots attached]
 

HaulingAss

Well-known member
First Name
Mike
Joined
Oct 3, 2020
Threads
4
Messages
1,522
Reaction score
2,869
Location
Washington State
Vehicles
2010 Ford F-150, 2018 Tesla Model 3 Performance
And let's talk about that California coastal fog......
Last night I drove about 20 miles in moderately heavy fog and Tesla FSD did great. It can see better than a human in fog because it can digitally boost the contrast of the images (the sketch below shows the idea).

LIDAR is another story. Once the fog crosses the LIDAR system's threshold, it simply can't cope, and that point is well below the point at which Tesla's vision-only system cuts out.
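Nobody outside Tesla knows their actual image pipeline, but as a sketch of the kind of digital contrast boost described above, contrast-limited adaptive histogram equalization (CLAHE) on a single frame looks like this; the function name and tuning values are illustrative:

```python
import cv2

def boost_fog_contrast(frame_bgr):
    """Illustrative only: stretch local contrast in a foggy frame."""
    # Work on the lightness channel so colors are not shifted.
    lab = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2LAB)
    l, a, b = cv2.split(lab)
    # CLAHE: adaptive histogram equalization with a clip limit that
    # keeps sensor noise from being over-amplified.
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    return cv2.cvtColor(cv2.merge((clahe.apply(l), a, b)), cv2.COLOR_LAB2BGR)
```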
 

HaulingAss

Well-known member
First Name
Mike
Joined
Oct 3, 2020
Threads
4
Messages
1,522
Reaction score
2,869
Location
Washington State
Vehicles
2010 Ford F-150, 2018 Tesla Model 3 Performance
I agree. I think the main reason Tesla doesn’t want to include radar is cost.
That's false. The problem with using radar to supplement vision is not cost; it's the dilemma of what the FSD computer should do when the radar and the camera do not agree. Humans don't rely on radar either, because we have eyes.

Have you ever heard of the expression "blind as a bat"? Bats use radar but have no vision.
 

firsttruck

Well-known member
Joined
Sep 25, 2020
Threads
98
Messages
1,767
Reaction score
2,638
Location
mx
Vehicles
none
That's false. The problem with using radar to supplement vision is not cost; it's the dilemma of what the FSD computer should do when the radar and the camera do not agree. Humans don't rely on radar either, because we have eyes.

Have you ever heard of the expression "blind as a bat"? Bats use radar but have no vision.

Yup, a conflict in interpreting the scene between vision and radar.

Also, bats don't have to read signs, traffic-light colors, or lane lines.

Elon said that if there were low-cost, super-high-resolution radar, there might not be so many conflicts; so low-cost, super-high-resolution radar could possibly become useful in the future.

Sandy Munro thinks FLIR should work today, but I wonder whether today's FLIR resolution is high enough. His experience is primarily with missiles: a missile that is going to blow a 20-meter (65 ft) hole can be off by 50 cm (20 in) without issue, but a car needs much higher accuracy, around 2 cm (0.8 in).
 

Crissa

Well-known member
First Name
Crissa
Joined
Jul 8, 2020
Threads
96
Messages
11,271
Reaction score
18,478
Location
Santa Cruz
Vehicles
2014 Zero S, 2013 Mazda 3
That's false. The problem with using radar to supplement vision is not cost; it's the dilemma of what the FSD computer should do when the radar and the camera do not agree. Humans don't rely on radar either, because we have eyes.

Have you ever heard of the expression "blind as a bat"? Bats use radar but have no vision.
Bats do have vision; they just use it for close-in or wide-angle information, not detailed narrow focus. (That narrow focus is what echolocation provides: kind of like radar, but ultrasonic.)

But your point stands: the cost isn't money but time, processing time. Humans can't validate raw radar data either, so we're no help in teaching the computer how to identify things using radar.

-Crissa
 


John K

Well-known member
First Name
John
Joined
Jan 2, 2020
Threads
34
Messages
2,390
Reaction score
4,703
Location
Los Angeles
Vehicles
Volt, CT reserve day 2
I am reiterating a question from the past.

Given that we do not have full autonomous driving, should we be driving, or driving at speed, in conditions beyond the capabilities of our human eyes and reflexes?

Drivers are supposed to be the overriding decision makers.

We can describe edge cases that argue for LiDAR, but will LiDAR see a B-2 bomber skimming the surface in front of me?

Regarding edge cases, here is one I encountered driving to a Thanksgiving dinner.

A freeway exit was under construction: no workers, and the left side lined with temporary concrete barriers. A large broken freeway sign with two posts was propped upside down on the other side of the barrier. The posts were darkened, and the left one hung over and partially entered the exit lane at a slight rightward curve in the exit ramp. The time of day cast a shadow that camouflaged the post.

If I had driven on the left side of my lane, I would have been impaled, catching the tip of the post, which was difficult to see.

As a semi-intelligent human being, seeing the sign leaning off to the side triggered caution and a closer inspection, which made discerning the posts easier.

If I were driving an autonomous vehicle and an accident occurred, the general populace would write it off as an edge case, unless they had a vested interest one way or the other. Because this is my edge case, I would like to think I would not have become complacent and would have been able to override this scenario.

Why do some think the problem cannot be solved without LiDAR to mimic human driving capabilities?
 

tidmutt

Well-known member
First Name
Daniel
Joined
Feb 25, 2020
Threads
4
Messages
432
Reaction score
698
Location
Somewhere hot and humid
Vehicles
Model Y Performance, Model X P100D
Occupation
Software Architect
Yep, I was driving, and whichever camera is facing the sun throws an error that visibility is limited and Autopilot cannot activate?! Examples…

[Two screenshots attached]
Yeah, it's definitely a thing. I've been waiting for it to happen to me. I do wonder how you deal with that in a vision system. We humans use a sun visor or sunglasses, but there are times when I'm effectively blinded and have to creep along, shifting position until I can continue safely. It's rare, but it's happened.
 

slomobile

Well-known member
First Name
Dustin
Joined
Apr 7, 2022
Threads
2
Messages
108
Reaction score
104
Location
Memphis
Vehicles
Cybertruck
Occupation
Roboticist
Not many people are willing to pay a million dollars for a Cybertruck.
True. But there are at least a dozen people who would pay a million and a half each to be the only people in the world with a Cybertruck for a year. Missed opportunity, Tesla.

Given that we do not have full autonomous driving, should we be driving, or driving at speed, in conditions beyond the capabilities of our human eyes and reflexes?

Drivers are supposed to be the overriding decision makers.
"Full" autonomous driving will always be a moving target as new features are discovered and implemented. Everything possible has not yet been dreamed of, much less invented. The 5 autonomy levels were defined as a working framework before most features existed. It will be irrelevant and antiquated at some point. Likely when most vehicles are between level 4 and 5 and some vehicles do things not yet dreamed of.

Human drivers are not always the overriding decision makers. Daytime running lights, antilock brakes, brake lights, gear overrides, speed limiters, RPM limiters, airbag deployment, and seatbelt retractor locking are just a few "dumb" systems that do not allow human override. Autonomous electronic "drivers" are a legally recognized entity responsible for the operation of the vehicle; a human "co-driver" is not a requirement for full autonomy.

The autonomous driver is not required to, and should not, hand control over to a human driver in situations where the autonomous driver is more capable than the human. Which "driver" is in control should be unambiguous and based on transparent rules. The rules for commercial airline pilots, copilots, and autopilots are a good example: https://skybrary.aero/articles/cockpit-automation-advantages-and-safety-challenges

Automation dependency is something we'll deal with on the highways soon.

With that background, let's address your question: should we be driving, or driving at speed, in conditions beyond the capabilities of our human eyes?

Driving, yes. At full speed, no.

If there is a single theoretical rule for optimizing collision safety, it would be this: minimize the difference in velocity to objects near a collision path. It works very well in swarm robotics and in nature. Watch a flock of birds or a school of fish move as an amorphous unit without communication or collisions: each member simply turns or accelerates to match the average of its neighbors' velocities (see the sketch below).
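A minimal sketch of that velocity-matching rule, in the spirit of the alignment step in Reynolds-style "boids" flocking; the function name and gain parameter are illustrative, not from any production system:

```python
import numpy as np

def velocity_match(own_vel, neighbor_vels, gain=0.1):
    # own_vel: this agent's velocity [vx, vy]
    # neighbor_vels: (N, 2) array of nearby agents' velocities
    # gain: fraction of the velocity difference corrected each step
    avg = neighbor_vels.mean(axis=0)
    # Nudge toward the local average; applied every timestep by every
    # agent, the group converges to a common heading and speed.
    return own_vel + gain * (avg - own_vel)

# e.g. velocity_match(np.array([30.0, 0.0]),
#                     np.array([[28.0, 1.0], [29.0, -1.0]]))
```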

But our highway system puts us nearly head-on, at a closing speed of double the speed limit, separated by only a painted line. So most of the time we must observe different rules to stay safe. But if you and the other drivers cannot see the painted lines, the safety usually provided by the rules of the road evaporates, and we must revert to rule 1: minimize the difference in velocity to objects near a collision path.

Neither human nor Auto can see the other vehicles, and the others cannot see us. We can guess at a set of likely behaviors from the other drivers; human and Auto can both know these probabilities, so we start even. Humans may have noticed trends in nearby drivers that give them an edge: is it a speed demon ahead, or a carload of church ladies? Which might pull over, and which might press on?

If everyone were to instantly slam on the brakes upon losing visibility, the blind areas would very quickly become parking lots full of crashed vehicles arriving from both directions on a busy highway. There simply isn't enough real estate on a busy road to park everyone. If we keep moving, slowly, we can lessen the pileup and minimize injury and damage. Humans have very few tools to do this effectively; autonomous drivers can have much better ones.

Auto knows from its last GPS fix where it is on the map. It knows the shape of the road ahead and can dead-reckon its way around curves using the wheel ABS sensors, better than a human, who probably assumes the road ahead is straight (a sketch of such an odometry step follows).
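A toy illustration of the idea, not Tesla's implementation: a textbook odometry step that advances a pose estimate from wheel rotation counts alone. Every name and parameter here is hypothetical, and a real car would use a full vehicle model plus sensor fusion:

```python
import math

def dead_reckon_step(x, y, heading, ticks_left, ticks_right,
                     ticks_per_meter, track_width):
    # Convert wheel-sensor tick counts to distances traveled.
    dist_l = ticks_left / ticks_per_meter
    dist_r = ticks_right / ticks_per_meter
    dist = (dist_l + dist_r) / 2.0               # travel of the midpoint
    heading += (dist_r - dist_l) / track_width   # turn from wheel-speed difference
    x += dist * math.cos(heading)
    y += dist * math.sin(heading)
    return x, y, heading
```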

Auto knows when it is going straight ahead, as well as its precise turning angle, turning rate, and acceleration curves, and the duration of its maneuvers down to the millisecond. A human just guesses and goes by feel.

A human can tell whether they are on rumble strips or not. Auto can tell precisely when it entered the rumble strips, how long it took to fully cross their known width, and how many rumbles it counted, and thus knows its heading angle relative to the strip, which is known to run parallel to the centerline (the geometry is sketched below).
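The geometry is plain trigonometry: crossing a strip of known width w along a path of length d means the lateral component of travel equals w, so sin(θ) = w/d. A tiny sketch, with an illustrative function name:

```python
import math

def crossing_angle_deg(strip_width_m, distance_traveled_m):
    # Crossing a strip of width w along a path of length d means the
    # lateral component of travel equals w, so sin(theta) = w / d.
    return math.degrees(math.asin(strip_width_m / distance_traveled_m))

# Example: a 0.3 m strip crossed over 3.4 m of travel is about a
# 5-degree heading error relative to the lane.
print(crossing_angle_deg(0.3, 3.4))  # ~5.06
```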

Google and Apple know where their phones are, including the ones in the car heading toward you. It is conceivable that nearby phones could communicate the presence of speed traps, road debris, and clusters of slowed or stopped vehicles in traffic lanes. What? They already do that? Maybe they could tell Auto, the driver.

Auto can use radar to detect large masses approaching from ahead or behind and make minimal course corrections to avoid them. Humans would be sitting ducks in front of a shotgun.

Auto can use the sonar sensors in the bumper to inch up to the stopped vehicle in front, maximizing the open area in the pileup zone if it comes to that.

Elon was right about "typical driving" when he said:
Pure vision, especially when using explicit photon count, is much better than radar+vision, as the latter has too much ambiguity – when radar & vision disagree, it is not clear which one to believe
However, in the circumstance described above, we know that vision has already failed. Trust radar. If vision has not failed, trust vision. Simple (see the sketch below). But this is a rare case. Why carry around expensive equipment for a rare case? And this is where people are right that dropping radar was a cost-saving measure. Both are true.
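That arbitration rule is simple enough to state in a few lines; here is a sketch of the fallback logic this paragraph describes (hypothetical names, not Tesla's code):

```python
def fused_range(vision_ok, vision_range_m, radar_range_m):
    # Vision is primary. Radar is consulted only when vision reports
    # that it has failed (e.g., fog or glare), so the two sensors never
    # get to disagree: exactly the ambiguity Elon's quote objects to.
    return vision_range_m if vision_ok else radar_range_m
```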

 

 