Better safety through radar

ajdelange

Well-known member
First Name
A. J.
Joined
Dec 8, 2019
Threads
4
Messages
3,213
Reaction score
3,403
Location
Virginia/Quebec
Vehicles
Tesla X LR+, Lexus SUV, Toyota SR5, Toyota Landcruiser
Occupation
EE (Retired)
Country flag
The wavelength of LIDAR is susceptible to rain and snowflakes,
Most LIDARs use 905 or 1550 nm. The long-wavelength end of the visible spectrum is about 780 nm (red), so LIDAR operates in the NIR. It should not be surprising that light at LIDAR wavelengths is scattered by rain and fog droplets. But LIDAR is range-gated, so one can discriminate against rain and fog to some extent by gating out the "clutter".
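As an illustration of the gating idea, here is a toy sketch with hypothetical numbers (not any particular LIDAR's processing): returns arriving outside the time window for the range of interest are simply discarded.

```python
# Toy illustration of range gating: keep only returns whose round-trip
# time corresponds to the range window of interest, discarding near-field
# "clutter" from rain/fog droplets close to the sensor.
C = 299_792_458.0  # speed of light, m/s

def round_trip_time(range_m):
    return 2.0 * range_m / C

def gate_returns(returns, r_min, r_max):
    """returns: list of (time_s, intensity); keep those inside [r_min, r_max]."""
    t_min, t_max = round_trip_time(r_min), round_trip_time(r_max)
    return [(t, i) for (t, i) in returns if t_min <= t <= t_max]

# Simulated returns: droplet clutter at 2 m, a car at 30 m, a wall at 80 m.
returns = [(round_trip_time(2.0), 0.9),
           (round_trip_time(30.0), 0.4),
           (round_trip_time(80.0), 0.2)]

gated = gate_returns(returns, r_min=10.0, r_max=60.0)  # only the 30 m target survives
```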

 

rr6013

Well-known member
First Name
Rex
Joined
Apr 22, 2020
Threads
54
Messages
1,680
Reaction score
1,620
Location
Coronado Bay Panama
Website
shorttakes.substack.com
Vehicles
1997 Tahoe 2 door 4x4
Occupation
Retired software developer and heavy commercial design builder
Country flag
I'm suggesting that an autonomous vehicle driving exactly like a human would be deemed unsafe, even though millions of humans drive that way daily.

NHTSA forced Tesla to recall FSD Beta for imitating the human "slow roll" through stop signs, even though this caused no accidents or injuries.

https://insideevs.com/news/564570/tesla-relcall-fsd-rolling-stops/
First principles: CA rolling stops at stop-signed intersections are legal for right-hand turns in California.
Second, cars intermixed in traffic that behave differently from human-operated cars are themselves a hazard. Rolling stops were legalized in recognition that CA is one vast developed space. Full stops, where unnecessary, waste time and serve no purpose when little traffic exists; further, full stops impede traffic where it does. Lastly, stop-signed intersections in CA are comparatively huge, with stop lines set back from pedestrian crossings, necessitating rolling forward to achieve clear views in all directions before turning safely. THUS rolling stops were legalized. I get that this differs from high-density metropolises elsewhere.
THUS NHTSA has now ruled for machine behavior that is in direct opposition to both human behavior and state traffic mandate.
FSD behavior is out of Tesla's hands now! Tesla customer experience (Human Factors Engineering), aka UX, is mandated across the industry? This shall not stand.

NHTSA essentially decrees that the typewriter interface and its behavior rule in the face of WYSIWYG. Apple would never have succeeded. Tesla can't innovate in the absence of harm, foul play, or malice, only to be struck down for reflecting human behavior. FSD == WYSIWYG for cars.
 

ajdelange

Well-known member
First Name
A. J.
Joined
Dec 8, 2019
Threads
4
Messages
3,213
Reaction score
3,403
Location
Virginia/Quebec
Vehicles
Tesla X LR+, Lexus SUV, Toyota SR5, Toyota Landcruiser
Occupation
EE (Retired)
Country flag
The problem I have with active systems is there is only so much band width to work with, either by legislation, or what frequency ranges can actually accomplish the job. The more and more cars that get on the road using active systems the more noise they have to deal with and the systems become degraded just by proximity.
The signals are doubtless spread using long codes for good Doppler resolution and fast chipping for good range resolution, so at a relatively short distance from the car their density is probably below the noise floor of the receiver. And the codes are effectively orthogonal, so in practice this is not a problem. It's called "code division multiplexing".
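A toy sketch of why near-orthogonal spreading codes keep neighboring radars from jamming each other (illustrative random codes, not any real automotive waveform): a matched receiver correlating against its own code recovers its own echo at full strength, while another car's code averages toward zero.

```python
# Toy code-division multiplexing sketch: two cars spread their radar
# pulses with different pseudo-random +/-1 codes. The matched receiver
# correlates against its own code; the other car's signal is suppressed.
import random

random.seed(0)
N = 1024
code_a = [random.choice((-1, 1)) for _ in range(N)]
code_b = [random.choice((-1, 1)) for _ in range(N)]

def correlate(x, y):
    return sum(a * b for a, b in zip(x, y)) / len(x)

auto = correlate(code_a, code_a)   # exactly 1.0: receiver A sees its own echo
cross = correlate(code_a, code_b)  # near 0: car B's signal averages out
```

For random codes of length N, the cross-correlation magnitude is on the order of 1/sqrt(N), which is why longer codes give better isolation between nearby emitters.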
 


electricAK

Well-known member
Joined
Mar 5, 2020
Threads
8
Messages
230
Reaction score
547
Location
Haines, Alaska
Vehicles
Cybertruck dual-motor
Country flag
Tesla recently signed a contract with Samsung to supply the (greater-than-human) high-resolution photometric image sensors behind Tesla's camera lenses. This is reputedly a ToF (Time of Flight) measurement, invisible to the naked eye, of photons returned to the sensor. SO Tesla is using a level of performance exceeding human eyesight in Hardware 4.0, included in the Cybertruck.
I just looked into the Samsung "ToF" sensors. It turns out it is indeed just a lidar: it sends an infrared pulse and uses the time of flight of the return signal to judge distance. First of all, it is hilarious that Samsung doesn't just call it LiDAR. That's what it is.

Secondly, it would be even more hilarious if Tesla started using this technology and *also* didn't call it Lidar. Which is probably what is going to happen. Gotta love the marketing on this.
 

ajdelange

Well-known member
First Name
A. J.
Joined
Dec 8, 2019
Threads
4
Messages
3,213
Reaction score
3,403
Location
Virginia/Quebec
Vehicles
Tesla X LR+, Lexus SUV, Toyota SR5, Toyota Landcruiser
Occupation
EE (Retired)
Country flag
Not LIDAR. No scanning. It’s actually radar with very good resolution in x, y, z, and t.
 

CyberGus

Well-known member
First Name
Gus
Joined
May 22, 2021
Threads
67
Messages
5,810
Reaction score
19,084
Location
Austin, TX
Website
www.timeanddate.com
Vehicles
1981 DeLorean, 2024 Cybertruck
Occupation
IT Specialist
Country flag
Not LIDAR. No scanning. It’s actually radar with very good resolution in x, y, z, and t.
"Scanning" is not a definitive property of either RADAR or LiDAR. The salient differentiator is the emission frequency: RADAR uses radio waves, while LiDAR employs the light spectrum.
  • RADAR is an acronym for Radio Detection and Ranging
  • LiDAR is an acronym for Light Detection and Ranging
 

CyberGus

Well-known member
First Name
Gus
Joined
May 22, 2021
Threads
67
Messages
5,810
Reaction score
19,084
Location
Austin, TX
Website
www.timeanddate.com
Vehicles
1981 DeLorean, 2024 Cybertruck
Occupation
IT Specialist
Country flag
Samsung probably markets their technology as "ToF" rather than "LiDAR" because the latter implies a device that gathers data, while the purpose of the former is to create photographs.

The newest iPhones have LiDAR in the camera, and they provide an API for applications to leverage the raw data.
 

Zabhawkin

Well-known member
Joined
Sep 1, 2021
Threads
11
Messages
323
Reaction score
529
Location
New Mexico
Vehicles
1999 Nissan Frontier, 2015 F-150, 1984 Jeep CJ7
Country flag
The first RADAR systems didn't scan; then someone got the bright idea to put the antenna on a spinning directional mount, which gave a rough idea of direction as well as distance. With modern phased-array antennas and a lot of black-magic-and-voodoo signal processing you get a lot more.
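Part of that "black magic" is electronic beam steering: applying a progressive phase shift across the array elements steers the beam with no moving parts. A minimal sketch of the array factor for a uniform linear array (hypothetical element count and spacing):

```python
# Array factor of a uniform linear phased array: the beam is steered
# electronically by a progressive per-element phase shift, no spinning
# antenna required. Half-wavelength element spacing assumed.
import cmath
import math

def array_factor(n_elements, steer_deg, look_deg, spacing_wl=0.5):
    """Normalized magnitude of the summed response of n_elements isotropic elements."""
    steer = math.radians(steer_deg)
    look = math.radians(look_deg)
    total = sum(
        cmath.exp(2j * math.pi * spacing_wl * k * (math.sin(look) - math.sin(steer)))
        for k in range(n_elements)
    )
    return abs(total) / n_elements

peak = array_factor(16, steer_deg=20.0, look_deg=20.0)   # 1.0 at the steered angle
off = array_factor(16, steer_deg=20.0, look_deg=-40.0)   # well below the main lobe
```

At the steered angle every element adds in phase; away from it the phasors cancel, which is what makes the "scan" purely electronic.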
 


slomobile

Well-known member
First Name
Dustin
Joined
Apr 7, 2022
Threads
2
Messages
108
Reaction score
104
Location
Memphis
Vehicles
Cybertruck
Occupation
Roboticist
Country flag
There have been about a dozen times in three decades of driving when, at highway speed, my forward visibility suddenly dropped to the point that I could no longer see my hood ornament due to weather.

It is extremely dangerous and frightening due to the ambiguity of the situation. You simply cannot know what the right thing to do is.

If it has never happened to you, you may assume the safest thing is to pull over and stop. If you are lucky, you only hear the crashes around you. More likely you are the cause of them.
You use your best judgement, but if you cannot see the road, or the results of your steering and braking, you could stop anywhere. Including sideways across multiple lanes.

It used to be that you could not tell when to stop pulling over until you were in the ditch. Now at least, the rumble strips carved into the shoulder line give an indication. Pull right until you hear the rumble strips twice, then stop, right? Did the driver in front of you already do that? Will the driver behind? The rumble strips help ensure that in sudden blindness, multiple lanes of traffic converge into one, all lined up to collide with one another on the marble-strewn shoulder.

It may surprise you to know that the official recommendation from multiple highway departments is to slow down slightly, but continue driving straight ahead while unable to see in these situations. Have the blind drivers surrounding you heard that advice? Does doubt cause you to tap the brakes when you should not?

This happens in areas where there is a sudden change in temperature, humidity, or density of wind carried material. These dangerous phenomena are temporary and localized to a relatively small area. "If you are going through hell, keep going."

When a Tesla visual system suddenly goes blind at highway speed, whether it is actively guiding the vehicle or not, it is a clear and present signal of an emergency happening right now, every bit as clear and reliable as the sensors that trigger airbags.

If your vehicle contains a legally recognized artificial driver, is that artificial driver legally culpable for negligence if it has notice and the means to successfully intervene and does not? It's an open question. I do not have an answer.

Radar, even with reduced performance during precipitation, combined with Autopilot functionality, provides a sufficient means to navigate through these brief, sudden loss-of-visibility emergencies.

The removal of radar could potentially have been an effort to limit liability. Adding it back is a good thing IMO.

I think our equipped vehicles should be given authority to activate radar-based Autopilot in a loss-of-visibility emergency, perhaps even taking control for up to 10 seconds without driver acknowledgement. Ten seconds is usually more than enough time to regain a limited amount of visibility, and this is not the time to pester drivers for acknowledgement.
 

slomobile

Well-known member
First Name
Dustin
Joined
Apr 7, 2022
Threads
2
Messages
108
Reaction score
104
Location
Memphis
Vehicles
Cybertruck
Occupation
Roboticist
Country flag
Back in the mid 80s I saw a demonstration of Lidar that ... could see objects at various distances through a sand storm or any other particle storm such as fog or rain!
In 2012 I saw a similar demonstration at an IEEE symposium in Daytona, FL, using low-power phased-array radar and convolutional neural networks that could reliably range and differentiate between humans, foliage, and animals of various sizes, including deer, moose, bear, and skunk, behind all manner of barriers. They were marketing it for border security and monitoring troop movements.
Similar tech could isolate drivers inside and outside vehicles at the roadside or in traffic.
 
Last edited:

ajdelange

Well-known member
First Name
A. J.
Joined
Dec 8, 2019
Threads
4
Messages
3,213
Reaction score
3,403
Location
Virginia/Quebec
Vehicles
Tesla X LR+, Lexus SUV, Toyota SR5, Toyota Landcruiser
Occupation
EE (Retired)
Country flag
"Scanning" is not a definitive property of either RADAR or LiDAR. The salient differentiator is the emission frequency: RADAR uses radio waves, while LiDAR employs the light spectrum.
There are radars that scan and phased array radars that don't. I've never seen or heard of a LIDAR that doesn't scan but perhaps there are some.

The ToF cameras are not really radars, in fact. They evidently use a pulsed NIR illuminator and measure the time of flight to each pixel. The lens is equivalent to a phased array (that's how lenses work), so they can measure the distance of the target within each pixel. The pixels evidently have the same Bayer spectral response (or equivalent), so one has visible-spectrum data for image formation and IR data for ranging. Amazing. And it adds about $80 to the cost of smartphones that incorporate it.
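The per-pixel ranging arithmetic is just round-trip time of flight: distance = c·t/2. A minimal sketch with hypothetical timing values:

```python
# Per-pixel time-of-flight ranging: each pixel records the round-trip
# time of a pulsed NIR illuminator and converts it to range.
C = 299_792_458.0  # speed of light, m/s

def tof_to_range(t_seconds):
    """Distance = c * t / 2 (the pulse travels out and back)."""
    return C * t_seconds / 2.0

# A tiny 2x2 "depth frame" of round-trip times in nanoseconds (hypothetical).
times_ns = [[13.3, 13.4],
            [20.0, 20.1]]
depth_m = [[tof_to_range(t * 1e-9) for t in row] for row in times_ns]
# 13.3 ns round trip is roughly 2 m; 20 ns is roughly 3 m
```

Note the timing precision required: resolving centimeters means resolving tens of picoseconds, which is why these sensors integrate many pulses per frame.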
 

JBee

Well-known member
First Name
JB
Joined
Nov 22, 2019
Threads
18
Messages
4,752
Reaction score
6,129
Location
Australia
Vehicles
Cybertruck
Occupation
. Professional Hobbyist
Country flag
You can also get a MASER that operates at radar frequencies, so technically there's no reason not to have a "MIDAR" as well... :cool:

Anyway, the interesting thing about ToF, as ajdelange said, is that it produces an instant per-pixel range point cloud with a camera image overlay, all at the same time per frame, without scanning or time referencing, all from a global-shutter camera chip that costs tens of dollars.

That means the camera is essentially taking 3D photos with one lens. The photos don't need processing to create a range point cloud for working out distances to objects, nor do frames have to be compared for depth, so a lot of the processing required to recreate the environment digitally, so that it can be labelled by AI, is done on the camera itself.

Imagine if you had megapixels of rangefinders that all measure at the same time and can also overlay image data. Maybe this picture helps visualise what it sees per frame:

[Image: ToF camera 3D depth map of a face (TOF_Kamera_3D_Gesicht)]



Apparently, Sony originally had these but Samsung is doing their own now.

As for radar on Teslas, I wouldn't put it past them to use a variation of Starlink Dishy tech on their cars as well. Beam-forming radar to look around corners and through traffic would be pretty epic, like using it to receive signals from under the car in front of you. 😋

There's another radar thread on here where similar things were discussed. I mentioned there, in regard to active vs. passive systems, my belief that using SDR in a passive, "listening"-only mode has merit. SDR passive radar development is coming along quite nicely, especially since hardware costs have plummeted over the last few years, with hardware available for under $100.

The advantage of a passive system is that every existing radio source also becomes a potential source of information and as such, location and environment can be observed in the RF space, as well as in the visual and acoustic ones.
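The core of a passive radar processor is cross-correlating a reference channel (the direct broadcast signal) against a surveillance channel (the same signal after bouncing off a target) to find the echo delay. A toy sketch with simulated signals (all values hypothetical):

```python
# Toy passive radar sketch: recover a target's delay by correlating the
# surveillance channel against the reference channel at candidate lags.
import random

random.seed(1)
N, DELAY = 4096, 37  # DELAY in samples (hypothetical target echo delay)
reference = [random.gauss(0, 1) for _ in range(N)]
# Surveillance channel: attenuated, delayed echo plus receiver noise.
surveillance = [0.3 * (reference[i - DELAY] if i >= DELAY else 0.0)
                + 0.05 * random.gauss(0, 1) for i in range(N)]

def xcorr_at(lag):
    """Cross-correlation of surveillance against reference at one lag."""
    return sum(surveillance[i] * reference[i - lag] for i in range(lag, N))

best_lag = max(range(100), key=xcorr_at)  # correlation peaks at the echo delay
```

In a real system the "reference" would be an existing emitter (FM broadcast, DVB-T, cell towers), which is exactly why every radio source becomes a potential source of information.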

With that we'll be like Superman when we drive our Teslas... or was that Iron Man? :cool::rolleyes::ROFLMAO:
 
 



