Why Do Teslas Keep Smashing Emergency Vehicles? (Warning: Graphic images in article)

JCERRN

Well-known member
First Name
John
Joined
May 3, 2021
Threads
6
Messages
119
Reaction score
133
Location
Massachusetts
Vehicles
Subaru Forester
https://news.google.com/articles/CB...zMTYwOTg0Lz9hbXA9MQ?hl=en-US&gl=US&ceid=US:en

Anyone wanna venture a guess as to why (aside from the fact that the media chooses to focus on anything that can be framed as a “Tesla hit piece”) there have been so many incidents of Autopilot veering into or close to parked emergency vehicles with their flashing lights? (The article does not say whether Autopilot was engaged, but I think it implies as much.)

 

flowerlandfilms

Well-known member
First Name
Eryk
Joined
Dec 6, 2020
Threads
5
Messages
794
Reaction score
1,690
Location
Australia
Vehicles
Yamaha SRV-250, Honda Odyssey RB1
Occupation
Film Maker
Most likely scenario is that people aren't paying attention and crash just like any other car. More Teslas on the road means it will happen more often, and selective reporting gives the appearance that it is a Tesla-specific issue.

IF it turns out to be an FSD crash, and we are talking about why the neural net would fail in this particular scenario, it could be that there aren't enough recorded crashes of these kinds of scenarios, and that they represent a gap in the data used to train it? That's speculation on my part.

It is a grim reality, but just like a person, it does need to be in a few crashes before it learns how to behave in a crash.
 

ÆCIII

Well-known member
Joined
Apr 27, 2020
Threads
10
Messages
1,058
Reaction score
2,492
Location
USA
Vehicles
Model 3
Any other brand, and they would've just referred to it as a "car". To the MSM, there are cars, and then there are Teslas. In the video by Sam Alexander linked below, this is abundantly clear:

[Embedded video by Sam Alexander]

They do mention Autopilot (with convenient speculation) at the bottom of the OP's article, though.

Of course we're going to hear much more about this, because it's blood in the water for the MSM.

From the pictures (https://t.co/YCGn8We1bK), this looks to me like a high-speed impact with no brakes applied (I couldn't see any tire skid marks either), which would suggest that Autopilot was likely not engaged. Judging by the car's mangled front end, they had to be going pretty fast, so fast that the airbags couldn't save the driver.

Of course, the MSM will likely get that wrong, since they can later claim it was a mistake with no liability, even if they knew they were lying on purpose.

I'd be interested to know whether the Automatic Emergency Braking feature was even enabled, because I don't see any tire skid marks, although the cars could've been moved and there is a lot of debris around. Then again, with ABS actuating, the wheels may not have locked or skidded even if Automatic Emergency Braking was applied. It should kick in with or without Autopilot as long as it's enabled, so the data showing whether that feature was enabled will be telling. I hope Tesla releases limited data to reveal the actual speed of impact.

I bet we won't even hear about the driver's toxicology test results unless they're totally negative (because it's a Tesla).

- ÆCIII
 

android04

Well-known member
Joined
Jul 28, 2020
Threads
2
Messages
316
Reaction score
614
Location
Crete, NE
Vehicles
2018 Tesla Model 3 LR RWD, Tesla Cybertruck Tri-motor (reserved)
A lot of the incidents I've read about happened some time ago, when Tesla was still using the front-facing radar for AP/FSD. The initial disclaimer one consented to when first enabling AP/FSD specifically called out the limitation that the system may not correctly identify stationary objects in a driving lane. It therefore warned that the driver must always be paying attention and be ready to take over.

The reason the radar and vision system could "miss" identifying a stationary object in the lane was the tricky way of combining and prioritizing the data from both sets of sensors (referred to as sensor fusion). The sensor fusion could cause a stationary object to be misidentified as a billboard or overpass, and the car would ignore it and just drive into it. On the other hand, if the sensor fusion was made more sensitive, the car would brake for a lot of things that aren't actually in the lane (also known as phantom braking).
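
To make that tradeoff concrete, here's a toy sketch of a threshold-based fusion rule. It is purely illustrative, not Tesla's actual logic; every name, threshold, and number in it is hypothetical.

```python
# Toy sketch of the sensor-fusion tradeoff described above.
# NOT Tesla's actual logic; all names and thresholds are hypothetical.
from dataclasses import dataclass

@dataclass
class Detection:
    radar_closing_speed_mps: float  # relative speed toward the return (radar)
    vision_object_prob: float       # vision net confidence: in-lane object
    vision_overhead_prob: float     # vision net confidence: overpass/billboard

def should_brake(d: Detection, object_threshold: float = 0.7) -> bool:
    if d.radar_closing_speed_mps <= 0:
        return False  # radar sees nothing we are closing on
    # Radar returns stationary echoes everywhere (signs, bridges), so a fused
    # system typically brakes only when vision agrees it's an in-lane object.
    if d.vision_overhead_prob > d.vision_object_prob:
        return False  # classified as overhead structure -> ignored (a "miss")
    return d.vision_object_prob >= object_threshold

# Lowering object_threshold catches more real obstacles (fewer misses like the
# parked-fire-truck case) but also fires on shadows and overpasses (phantom
# braking). One knob trades misses against false alarms.
stopped_truck = Detection(30.0, 0.65, 0.30)
print(should_brake(stopped_truck))                        # False: missed object
print(should_brake(stopped_truck, object_threshold=0.6))  # True, at the cost of more false alarms
```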

Tesla tried to reduce these incidents a few years back by having the car slow down automatically and flash a message on the screen if it detected flashing lights that could be from emergency response vehicles. Also, Tesla stopped shipping the radar module and disabled the existing one in cars with FSD Beta, to eliminate the harder work of sensor fusion. I'm not sure whether these changes have reduced or eliminated the instances of Teslas on AP/FSD rear-ending emergency response vehicles.

This new incident in the OP might suggest that the problem hasn't been completely eliminated, but we also don't know the hardware and year of this Model S. It looks like an older Model S to me, and therefore it might not have had a new enough AP computer to benefit from software updates related to this. It may not even have had AP hardware (early Model S cars didn't have any autonomous capabilities), or it may have been a repaired or jailbroken vehicle without the latest software updates. It also could have been a failure of Tesla's AP/FSD system while the driver wasn't paying attention like he should have been, so he couldn't correct it.
 

JCERRN

Well-known member
First Name
John
Joined
May 3, 2021
Threads
6
Messages
119
Reaction score
133
Location
Massachusetts
Vehicles
Subaru Forester
flowerlandfilms said:
Most likely scenario is that people aren't paying attention and crash just like any other car. More Teslas on the road means it will happen more often, and selective reporting gives the appearance that it is a Tesla-specific issue.

IF it turns out to be an FSD crash, and we are talking about why the neural net would fail in this particular scenario, it could be that there aren't enough recorded crashes of these kinds of scenarios, and that they represent a gap in the data used to train it? That's speculation on my part.

It is a grim reality, but just like a person, it does need to be in a few crashes before it learns how to behave in a crash.

Right. I mean, people are supposed to be paying attention when AP is engaged and be ready to take over at any time. A shame, because this comes on the heels of the “massive” FSD “recall.”
 



JCERRN

Well-known member
First Name
John
Joined
May 3, 2021
Threads
6
Messages
119
Reaction score
133
Location
Massachusetts
Vehicles
Subaru Forester
android04 said:
A lot of the incidents I've read about happened some time ago, when Tesla was still using the front-facing radar for AP/FSD. The initial disclaimer one consented to when first enabling AP/FSD specifically called out the limitation that the system may not correctly identify stationary objects in a driving lane. It therefore warned that the driver must always be paying attention and be ready to take over.

The reason the radar and vision system could "miss" identifying a stationary object in the lane was the tricky way of combining and prioritizing the data from both sets of sensors (referred to as sensor fusion). The sensor fusion could cause a stationary object to be misidentified as a billboard or overpass, and the car would ignore it and just drive into it. On the other hand, if the sensor fusion was made more sensitive, the car would brake for a lot of things that aren't actually in the lane (also known as phantom braking).

Tesla tried to reduce these incidents a few years back by having the car slow down automatically and flash a message on the screen if it detected flashing lights that could be from emergency response vehicles. Also, Tesla stopped shipping the radar module and disabled the existing one in cars with FSD Beta, to eliminate the harder work of sensor fusion. I'm not sure whether these changes have reduced or eliminated the instances of Teslas on AP/FSD rear-ending emergency response vehicles.

This new incident in the OP might suggest that the problem hasn't been completely eliminated, but we also don't know the hardware and year of this Model S. It looks like an older Model S to me, and therefore it might not have had a new enough AP computer to benefit from software updates related to this. It may not even have had AP hardware (early Model S cars didn't have any autonomous capabilities), or it may have been a repaired or jailbroken vehicle without the latest software updates. It also could have been a failure of Tesla's AP/FSD system while the driver wasn't paying attention like he should have been, so he couldn't correct it.

Either way, it's still the driver's fault, not the car's.
 

BillyGee

Well-known member
First Name
Bill
Joined
Jan 22, 2020
Threads
8
Messages
708
Reaction score
1,534
Location
Northern California
Vehicles
Model Y P, Model 3 LR, Founders CT (Ordered)
Occupation
Technician
Considering how badly my car flips out on autopilot when it sees anything that can be vaguely interpreted as emergency lights on either side of the road, I highly doubt AP was engaged when these happened. This is probably just clickbait and Tesla bashing.

My Model Y on AP once slammed the brakes and beeped wildly at a flashing light on the back of a cyclist's helmet. Good times.
 

Crissa

Well-known member
First Name
Crissa
Joined
Jul 8, 2020
Threads
127
Messages
16,612
Reaction score
27,655
Location
Santa Cruz
Vehicles
2014 Zero S, 2013 Mazda 3
It was in the original report.

[Embedded original report from CalFire]

CalFire always says the brand when it's visible, and always reports these collisions, if only to get people to stop doing this. Looks like the vehicle was seriously speeding.

-Crissa
 

Diehard

Well-known member
First Name
D
Joined
Dec 5, 2020
Threads
23
Messages
2,127
Reaction score
4,248
Location
U.S.A.
Vehicles
Olds Aurora V8, Saturn Sky redline, Lightning, CT2
It looks like the passenger may survive, so they may be able to clarify whether FSD or Autopilot was engaged.

In this case, regardless of whether FSD was engaged, I think it is very likely the driver was doing something irresponsible (it's hard to miss those flashing lights, especially at night). That said, in general I don't quite understand the argument that the driver should have paid attention while FSD was engaged. Isn't the whole point of FSD being able to pay less attention? Even if you have signed some sort of release or agreement that you will pay attention and are responsible, wouldn't paying less attention be a natural byproduct of FSD? If any of you have been selected to test FSD Beta, and can be honest here (since you are not using your real name), please share how much attention you paid throughout (the beginning, middle, and end of) the testing period in comparison to when you drove a vehicle with no driver assistance.

For me it has always been a binary thing. Either I can trust it or I can't. Anything in between, I'd rather it be someone else testing it on a road I am not using.
 

Zabhawkin

Well-known member
Joined
Sep 1, 2021
Threads
11
Messages
323
Reaction score
529
Location
New Mexico
Vehicles
1999 Nissan Frontier, 2015 F-150, 1984 Jeep CJ7
180 people died in crashes involving emergency vehicles in 2020. There are approximately 6,500 crashes involving ambulances each year. Teslas have been involved in 14 total crashes with emergency vehicles.
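
For scale, here's the back-of-the-envelope arithmetic on those figures. It's a rough comparison: the 14 Tesla incidents accumulated over several years, while the ambulance figure is per year and covers only one type of emergency vehicle, so if anything this overstates Tesla's share.

```python
# Rough upper bound on the share of emergency-vehicle crashes involving
# Teslas, using only the figures cited above. The 14 Tesla incidents span
# multiple years and ambulances are just a subset of emergency vehicles,
# so the true per-year share is smaller than this.
tesla_emergency_vehicle_crashes = 14   # total, all years
ambulance_crashes_per_year = 6500
share = tesla_emergency_vehicle_crashes / ambulance_crashes_per_year
print(f"{share:.2%}")  # ~0.22%
```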

I remember, years ago before Tesla was a thing, discussing several fatal accidents involving stopped police cars. Back then it came down to people not paying attention, plus the fact that police cars were not designed to take a hit from behind at 80+ mph, which would cause the fuel tank to rupture.

But hey, let's blame the software that most likely wasn't in use, and even if it was, the person behind the wheel wasn't paying attention.

In Albuquerque they have a large, brightly colored truck that can lower a crash barrier to lessen an impact. I don't know if it's been hit yet or not, but it patrols the freeways making safe spots for emergency crews and disabled vehicles.
 


ÆCIII

Well-known member
Joined
Apr 27, 2020
Threads
10
Messages
1,058
Reaction score
2,492
Location
USA
Vehicles
Model 3
Diehard said:
It looks like the passenger may survive, so they may be able to clarify whether FSD or Autopilot was engaged.

In this case, regardless of whether FSD was engaged, I think it is very likely the driver was doing something irresponsible (it's hard to miss those flashing lights, especially at night). That said, in general I don't quite understand the argument that the driver should have paid attention while FSD was engaged. Isn't the whole point of FSD being able to pay less attention? Even if you have signed some sort of release or agreement that you will pay attention and are responsible, wouldn't paying less attention be a natural byproduct of FSD? If any of you have been selected to test FSD Beta, and can be honest here (since you are not using your real name), please share how much attention you paid throughout (the beginning, middle, and end of) the testing period in comparison to when you drove a vehicle with no driver assistance.

For me it has always been a binary thing. Either I can trust it or I can't. Anything in between, I'd rather it be someone else testing it on a road I am not using.

Are you really serious?

Everyone enabling FSD is prompted to acknowledge that they still must pay attention. They have to press the acknowledgment button stating they understand these responsibilities. Here is the screen prompt everyone must acknowledge when enabling FSD Beta:

[Screenshot: the FSD Beta acknowledgment prompt]


If FSD were not in this early limited-access Beta development phase, your question might be valid, but everyone should be sincere when acknowledging this prompt. Anyone who does not pay attention to what they read and accept is likely fraught with other problems in life, in addition to being unsuitable for testing FSD Beta.

Because today's social and cultural norms often leave people impatient, inattentive, and emotionally immature, it may be a good idea for Tesla to make the FSD enabling prompt much more in-depth: require users to read it and then answer something like a ten-question quiz before the actual acknowledgment appears, to ensure they have absorbed the content and consciously accepted their responsibilities.

But even without such an extensive quiz before the prompt, Tesla has its legal responsibilities covered with the prompt currently in use, and Tesla should not have to babysit those who are too impatient to read the content they are acknowledging.

- ÆCIII
 

Bakeram1

New member
First Name
Andy
Joined
Aug 9, 2021
Threads
0
Messages
3
Reaction score
8
Location
Hollywood, Maryland
Vehicles
2004 F-150, VW Passat
Occupation
Engineer
With a vision-based Autopilot, I'm wondering if the cameras reduce their gain when the strobe lights flash, and are therefore unable to make out the rest of the scene between flashes. Essentially, the cameras are blinded by the flash.
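
To illustrate the mechanism being suggested, here's a toy auto-exposure loop. The numbers are hypothetical, and a real automotive camera pipeline is far more sophisticated (HDR, per-region metering), but it shows how a bright flash can drag the gain down and under-expose the dark frames that follow.

```python
# Toy proportional auto-exposure controller. Hypothetical numbers only; this
# just illustrates the "blinded between flashes" hypothesis above.

def update_gain(gain: float, observed_brightness: float,
                target: float = 0.5, rate: float = 0.5) -> float:
    # Move gain toward the target mean brightness (classic proportional AE).
    return max(0.05, gain * (1 + rate * (target - observed_brightness)))

gain = 1.0
# Scene brightness at unit gain: dark night (0.1) punctuated by strobes (0.9).
scene = [0.1, 0.1, 0.9, 0.1, 0.1, 0.9, 0.1]
for brightness in scene:
    observed = min(1.0, brightness * gain)  # sensor clips at full scale
    gain = update_gain(gain, observed)
    print(f"scene={brightness:.1f} observed={observed:.2f} next_gain={gain:.2f}")
# Each flash pushes the controller to cut gain, so the dark frames right after
# a strobe come out under-exposed: the rest of the scene is harder to see.
```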
 

ÆCIII

Well-known member
Joined
Apr 27, 2020
Threads
10
Messages
1,058
Reaction score
2,492
Location
USA
Vehicles
Model 3
Bakeram1 said:
With a vision-based Autopilot, I'm wondering if the cameras reduce their gain when the strobe lights flash, and are therefore unable to make out the rest of the scene between flashes. Essentially, the cameras are blinded by the flash.

This is not even a valid question yet, because we don't know whether Autopilot was engaged. But even if it was, I'm very sure Tesla's engineers were exhaustively thorough in evaluating the dynamic-range performance of the cameras they chose for their FSD hardware suite.

Regardless of all that, per post #11 above, each driver acknowledges their responsibility to always pay attention even when FSD is engaged.

-ÆCIII
 

Diehard

Well-known member
First Name
D
Joined
Dec 5, 2020
Threads
23
Messages
2,127
Reaction score
4,248
Location
U.S.A.
Vehicles
Olds Aurora V8, Saturn Sky redline, Lightning, CT2
ÆCIII said:
Are you really serious?

I am really serious. Your response didn't clear anything up for me. I completely understand Tesla's point of view; no need for clarification there. If I were in Elon's shoes and someone came to me and said they would put their own life on the line, for free, to do my testing, so I didn't have to hire and pay someone or use my own hardware, and instead they would pay me for a car, use it for testing, and then pay me $15K+ for the software after release, I would jump on that as well. That makes perfect sense.

What I don't get is why the owners do it, and I'm also curious whether they can really do what they said they would and stay fully attentive while sharing the kitchen with another cook (FSD) 100% of the time. If that was your own screenshot you shared, answering the following would answer my question:

Why did you click on Yes?

Do you own, or have you ever owned, TSLA? (Donating your resources to protect your investment makes a bit of sense.)

Were you able to pay attention the entire time it was enabled, as much as you would in a vehicle without driver assistance, or were there occasions when it was doing such a good job that you found yourself enjoying the scenery?
 

ÆCIII

Well-known member
Joined
Apr 27, 2020
Threads
10
Messages
1,058
Reaction score
2,492
Location
USA
Vehicles
Model 3
Diehard said:
I am really serious. Your response didn't clear anything up for me. I completely understand Tesla's point of view; no need for clarification there. If I were in Elon's shoes and someone came to me and said they would put their own life on the line, for free, to do my testing, so I didn't have to hire and pay someone or use my own hardware, and instead they would pay me for a car, use it for testing, and then pay me $15K+ for the software after release, I would jump on that as well. That makes perfect sense.

What I don't get is why the owners do it, and I'm also curious whether they can really do what they said they would and stay fully attentive while sharing the kitchen with another cook (FSD) 100% of the time. If that was your own screenshot you shared, answering the following would answer my question:

Why did you click on Yes?

Do you own, or have you ever owned, TSLA? (Donating your resources to protect your investment makes a bit of sense.)

Were you able to pay attention the entire time it was enabled, as much as you would in a vehicle without driver assistance, or were there occasions when it was doing such a good job that you found yourself enjoying the scenery?

The answer to your (relevant) question is a no-brainer: Yes.

If you can Pay Attention without FSD engaged, or while driving a legacy auto that doesn't have the feature, then you can certainly Pay Attention with FSD.

You engage it to help Tesla refine its driving behavior and collect data, but you rest your hands on the wheel and Pay Attention as if it weren't engaged at all. Anyone who cannot comprehend or handle this responsibility should not be allowed to use FSD Beta.

-ÆCIII