Latest Emergency Vehicle Collision in Houston

Ogre

But as soon as there was a bend in the INTERSTATE, the car chimes and shuts off Blue Cruise.
FWIW, Autopilot does this as well, though only on **much** tighter curves on 2-lane roads. I've had it happen on some winding 15-25 mph curves. Generally it's so awkward on those kinds of curves that I just take over regardless.

Blue Cruise is clearly way more picky about this. The fact that it still struggles when it's limited to 130,000 miles of pre-mapped highway is kind of weak.

But the auto-journalists go ape because "It's true hands free".

 

HaulingAss

#2 The drivers were inattentive and Autopilot just screwed the pooch. The driver is still at fault, but this type of accident concerns me because it's exactly the sort of thing I feel autopilot should *never do*. If I have a heart attack or fall asleep at the wheel, I should be able to rely on Autopilot to do something reasonably safe.

There have been past autopilot failures where the driver was merely inattentive or in one case in control but assumed autopilot would dodge something and didn't. That's why this concerns me.
My point was that when all these issues are solved it won't be merely Autopilot, it will be Full Self-Driving. In other words, we can expect it to not be able to handle everything without driver intervention until it can.

If what concerns you is that it seems like a very simple problem to solve, I would suggest that is not necessarily the case. Tesla's Autonomy Team has been focusing on complex urban environments for some time (at the expense of simple highway driving). I think Tesla temporarily suspended development of freeway capabilities when it reached the state of being a very comfortable and easy-to-use driver's aid (as long as the driver kept his/her attention down the road). The problem with parked objects in your lane is that it requires a complex series of decisions as to whether the car should brake instantly or can simply slow a little and drive around the obstacle, perhaps in an adjacent lane. Even when it becomes safer than a human driver, statistically speaking, it will have more unnecessary sudden braking events than a human would have. These will be extremely unnerving for a lot of people until they understand they are part of how the system achieves a level of safety human hands can't match.

When FSD is ready for use without human oversight, in order to maintain a chosen level of safety that is significantly better than human, Tesla will have to decide whether to accept a much higher number of early, gentle precautionary braking events (that turn out to be unnecessary), a much lower number of moderate early braking events (that turn out to be unnecessary), or a smaller number of sudden and very harsh braking events (that turn out to be unnecessary).

Humans tend to avoid sudden harsh braking events unless there is no ambiguity that braking is absolutely necessary. This is true even though humans would be safer drivers if they were willing to brake hard as a precautionary measure when the threat seemed uncertain (assuming it was safe to do so without being rear-ended). Autonomous cars will have a big advantage here: they always know when it is safe to brake strongly, because they are constantly monitoring what is going on behind the vehicle. So, even though autonomous cars will be safer, they might not project as much confidence to passengers as a human who has an ego to protect. This is a difficult part of autonomy development to transition through, and it does not lend itself to incremental improvements.
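
To make the trade-off concrete, here's a toy sketch (the names, thresholds and numbers are all invented; no real system works exactly like this) of a planner that picks a braking level from an estimated collision probability and can afford to brake harder when it knows the lane behind is clear:

```python
# Hypothetical illustration of the braking trade-off described above.
# None of these names or numbers come from Tesla or any real system.

from dataclasses import dataclass

@dataclass
class Threat:
    collision_probability: float   # 0.0-1.0 estimate from perception
    rear_gap_seconds: float        # time gap to the vehicle behind

def braking_command(threat: Threat) -> str:
    """Map an ambiguous threat to a braking level (illustration only)."""
    # If the lane behind is clear, hard precautionary braking is cheap,
    # so the system can afford to react to lower-probability threats.
    safe_to_brake_hard = threat.rear_gap_seconds > 2.0

    if threat.collision_probability > 0.9:
        return "hard_brake"        # unambiguous threat
    if threat.collision_probability > 0.5:
        return "hard_brake" if safe_to_brake_hard else "moderate_brake"
    if threat.collision_probability > 0.2:
        # Early, gentle precautionary braking: often unnecessary in
        # hindsight, which is exactly the unnerving case described above.
        return "gentle_brake"
    return "no_action"

print(braking_command(Threat(0.6, rear_gap_seconds=3.5)))  # hard_brake
print(braking_command(Threat(0.6, rear_gap_seconds=0.8)))  # moderate_brake
```

Lowering those thresholds buys safety at the cost of more braking events that turn out, in hindsight, to have been unnecessary.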

My take on this is that Tesla has chosen to freeze development releases of highway threat detection where they currently are and to rely upon humans, in order to avoid an excessive number of unnecessary emergency braking events until the system is more developed. In other words, some development stages involve awkward and unsafe transitions, and it's better to keep relying on humans through them than to have excessive emergency braking events while still not being quite as safe as a good human driver.

None of this concerns me; Tesla is merely making rational decisions that will allow the quickest and safest transition to autonomy without causing people to lose confidence in the idea of autonomy. The problem here is the small percentage of drivers who don't pay very close attention as they barrel down the highway at 70 mph (regardless of whether they have Autopilot or not). It's unreasonable to assume that just because Tesla says a human must be in control at all times, people will suddenly be even more attentive than they are without Autopilot. These types of accidents happen with shocking regularity even without driver assistance; Autopilot is not the problem, not paying attention is.
 

HaulingAss

FWIW, Autopilot does this as well, though only on **much** tighter curves on 2-lane roads. I've had it happen on some winding 15-25 mph curves. Generally it's so awkward on those kinds of curves that I just take over regardless.

Blue Cruise is clearly way more picky about this. The fact that it still struggles when it's limited to 130,000 miles of pre-mapped highway is kind of weak.

But the auto-journalists go ape because "It's true hands free".
I don't see how you can even compare these two things, because Blue Cruise is shutting off on regular curves on Blue Cruise-enabled freeways that regular drivers will take at 75 mph without even easing off the throttle.

In comparison, I can drive a rural, two-lane State Highway and Autopilot will slow down from 55 mph to 40 mph to take tighter curves. It will generally slow down to about the same speed a typical human might on the same corner. I agree, it doesn't do it as gracefully as a human, at least not consistently, but Autopilot is not even approved by Tesla for use on undivided two-lane rural highways. With Blue Cruise we are talking about a regular limited-access freeway that is Blue Cruise enabled/approved.
 


Crissa

Yes, with the traffic-aware cruise control in my mom's Subaru, I found that if you began a panic stop it would disengage - sometimes making the stopping distance longer than necessary - but if you tried to avoid the alert and power through it, it would panic and begin braking again. My mom didn't like the former and I didn't like the latter.

I really wanted a button or tap on the cruise-control stick to tell it I was taking over or to pause control for a moment. Many of these other Level 2 options do not handle this handoff to human control gracefully.
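
Roughly, the hand-off I'm wishing for looks like a small state machine, and the complaint is about which events trigger which transitions. A toy sketch (everything in it is invented for illustration, not any manufacturer's actual logic):

```python
# Toy sketch of the hand-off problem described above. States, method names
# and behavior are invented; real systems differ by manufacturer.

class CruiseHandoff:
    def __init__(self):
        self.state = "assist_active"

    def driver_brakes_hard(self):
        # Failure mode one: a panic stop disengages assistance entirely,
        # sometimes lengthening the stop the driver expected help with.
        self.state = "driver_only"

    def driver_powers_through_alert(self):
        # Failure mode two: overriding the alert makes the system
        # re-assert itself and start braking again.
        self.state = "assist_braking"

    def driver_taps_pause(self):
        # The missing feature: an explicit "I have it" control that pauses
        # assistance cleanly instead of fighting over control.
        self.state = "paused_by_driver"

handoff = CruiseHandoff()
handoff.driver_brakes_hard()
print(handoff.state)   # driver_only
```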

-Crissa
 

Ogre

My point was that when all these issues are solved it won't be merely Autopilot, it will be Full Self-Driving. In other words, we can expect it to not be able to handle everything without driver intervention until it can.
I am reasonably happy with the current state of Autopilot, regardless of whether it is hitting cars on the side of the road or not. My point is more that I want to know what the limits of the technology are. And secondarily, I want Tesla to aspire to Autopilot *NOT HITTING STATIONARY OBJECTS*. Maybe it's not there now; I accept that. But it should be there eventually.

Also... no double standards here. Everyone's Autopilot / Blue Cruise / Copilot / whatever should be held to the same standard.



I don't see how you can even compare these two things, because Blue Cruise is shutting off on regular curves on Blue Cruise-enabled freeways that regular drivers will take at 75 mph without even easing off the throttle.

In comparison, I can drive a rural, two-lane State Highway and Autopilot will slow down from 55 mph to 40 mph to take tighter curves. It will generally slow down to about the same speed a typical human might on the same corner. I agree, it doesn't do it as gracefully as a human, at least not consistently, but Autopilot is not even approved by Tesla for use on undivided two-lane rural highways. With Blue Cruise we are talking about a regular limited-access freeway that is Blue Cruise enabled/approved.
Sure... that's kind of the whole point of my comment. There are limits to Autopilot, but they are way, way beyond those Blue Cruise has.

Also, I've seen you comment previously about where Autopilot is approved for use, and it doesn't mesh with my understanding of the feature. This is the documentation I see: "Autosteer: Assists in steering within a clearly marked lane, and uses traffic-aware cruise control". No limits on undivided highways or in/out of cities.

My understanding, and IMO the most reasonable understanding, is that it is designed to be used any time you can activate the feature. If it were otherwise, what is the point of making it available when it's not an approved use? Seems like a recipe for confusion.
 

tidmutt

It's exceedingly difficult to measure things that don't happen.

There is a lot of selection bias involved in any Autopilot/FSD numbers as well. Lots of people only engage Autopilot during relatively "easy" traffic conditions. Also, any time you have Autopilot engaged, you always have a safety driver. If the safety driver disengages and prevents an accident, it doesn't get counted against Autopilot.

I do think autopilot and FSD are already saving lives. I just think most numbers around it are highly questionable.

I'm also super interested in Autopilot because I'm getting older. A little behind you, but close enough that it's a concern. Also, when you spend a week over-exerting yourself and then have to drive home, like I did this last week, having a little assistance is much appreciated.
You could argue that if a human and FSD together result in an X-times-safer driving experience, then it doesn't really matter. Whether it's FSD alone or the combination of human and FSD that gets the credit is somewhat irrelevant, except to those developing FSD of course.

Now I would like to see the data on this from Tesla independently verified. Not that I think Tesla would outright lie about it but they certainly could spin it.
 

Ogre

You could argue that if a human and FSD together result in an X-times-safer driving experience, then it doesn't really matter.
I was thinking something similar. Establish a correlation between higher total FSD and/or Autopilot use and fewer accidents. Obviously this would need to be spread over thousands of drivers. Not perfect, but better than what Tesla currently reports, I think.
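
Something like this sketch, with entirely made-up numbers (the usage shares, accident rates and the assumed benefit are invented purely to show the shape of the analysis):

```python
# Hypothetical sketch of the correlation study suggested above: across a
# few thousand drivers, does a higher share of Autopilot/FSD miles go with
# a lower accident rate? Every number here is invented for illustration.

import random

random.seed(0)

def simulate_driver():
    ap_share = random.random()                      # fraction of miles on AP/FSD
    base_rate = 2.0e-6                              # accidents per mile (invented)
    true_rate = base_rate * (1.0 - 0.3 * ap_share)  # assume AP helps a little
    observed = max(0.0, random.gauss(true_rate, 1.0e-6))
    return ap_share, observed

drivers = [simulate_driver() for _ in range(5000)]
xs = [d[0] for d in drivers]
ys = [d[1] for d in drivers]

# Pearson correlation, computed by hand so this runs on any Python 3.
n = len(drivers)
mx, my = sum(xs) / n, sum(ys) / n
cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
sx = (sum((x - mx) ** 2 for x in xs) / n) ** 0.5
sy = (sum((y - my) ** 2 for y in ys) / n) ** 0.5
print("correlation:", cov / (sx * sy))   # negative: more AP use, fewer accidents
```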

Not that I think Tesla would outright lie about it but they certainly could spin it.
In my cynical eye, Tesla does "Spin" things a fair bit. Range is my pet peeve, but I think their stats on FSD safety are of limited usefulness and benefit Tesla.

It's hard to be cynical and a fan... but here I am.
 

tidmutt

I was thinking something similar. Establish a correlation between higher total FSD and/or Autopilot use and fewer accidents. Obviously this would need to be spread over thousands of drivers. Not perfect, but better than what Tesla currently reports, I think.


In my cynical eye, Tesla does "Spin" things a fair bit. Range is my pet peeve, but I think their stats on FSD safety are of limited usefulness and benefit Tesla.

It's hard to be cynical and a fan... but here I am.
I think it's wise to temper any fandom with some cynicism. After all, they exist in a world where spin is a common denominator so it's to be expected.

To be honest, I would be disappointed if they didn't spin things, because it helps achieve their mission, which is something I believe in. I suppose you could say that I think the ends justify the means in this case, because I suspect FSD IS safer when used properly and will ultimately be MUCH safer. If a little spin gets us there, then so be it.
 


CyberG

Several of the 11 incidents involved drunk driving.

There is also a minority of Tesla drivers who feel they can turn on autopilot and punch out from the world, playing video games, watching movies, reading books, napping, etc. I'm pretty sure it's a minority anyhow.
I recall that one of the early crashes on Autopilot involved a driver watching a movie on his iPad. And he was watching a Harry Potter movie, just to make him look that much dumber.

St Peter: how did you die?

driver: watchin a movie while driving

St. Peter: what movie?

driver: (hangs head)…
 

Ogre

As a software developer, I can tell you test suites are common. A test suite covers any and all expected behaviors the program will encounter. When the input is as broad as *reality*, it's impossible to develop an exhaustive test suite. Instead you try to build tests around all expected inputs and perhaps the worst-case scenarios.

It's a common practice when you encounter a bug or edge case to add that to your test suite.

Every time you change your software and rebuild it, you run the test suite to ensure that all of your tests still pass. Tesla's FSD code isn't quite the same as normal software, but I know they use similar methodologies to build their current tools. They spoke about creating a full simulated world, comparable to a video game engine, so they could create and test scenarios and train the FSD neural nets.

Presumably every time a Tesla on Autopilot or FSD is involved in a collision, it gets added to their test suite.
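
To make that concrete, here's roughly what the pattern looks like in an ordinary project (the planner and scenario names below are invented purely to illustrate the pattern, not anything from Tesla's code):

```python
# test_planner_regressions.py -- ordinary regression-test pattern described
# above. The planner, scenarios and function names are invented for
# illustration; they are not from any real autonomy codebase.

def plan_response(scene: dict) -> str:
    """Stand-in for the system under test; a real project would import it."""
    if scene.get("stationary_object_in_lane") and scene.get("speed_mph", 0) > 0:
        return "slow_and_avoid"
    return "continue"

# Tests written up front for the expected, everyday inputs.
def test_clear_lane_continues():
    assert plan_response({"speed_mph": 65}) == "continue"

# Regression test added after an incident (a stopped vehicle partially
# blocking the lane) was reproduced in simulation. It stays in the suite
# forever so the same failure can't silently come back.
def test_stopped_vehicle_in_lane_is_avoided():
    scene = {"speed_mph": 65, "stationary_object_in_lane": True}
    assert plan_response(scene) == "slow_and_avoid"

if __name__ == "__main__":
    test_clear_lane_continues()
    test_stopped_vehicle_in_lane_is_avoided()
    print("all regression tests passed")
```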

I doubt FSD and Autopilot are going to be separate code bases (or in this case, neural networks). Autopilot will be a less capable version of FSD, equally capable of avoiding potential collisions but not able to change lanes, take off-ramps, etc. Having one code base makes things simpler and makes testing a lot easier. My feeling is that when "FSD 10" launches, Autopilot is going to get a big shot in the arm in terms of safety as well.
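
One way to picture that is a single stack where the product tier only gates which maneuvers the planner may execute. This is pure speculation about the structure on my part; the names and flags below are invented and nothing here is Tesla's actual code:

```python
# Illustration only of the "one stack, gated capabilities" idea above.
# Nothing here reflects Tesla's actual architecture.

from dataclasses import dataclass

@dataclass(frozen=True)
class Capabilities:
    lane_changes: bool
    off_ramps: bool

AUTOPILOT = Capabilities(lane_changes=False, off_ramps=False)
FSD = Capabilities(lane_changes=True, off_ramps=True)

def choose_maneuver(desired: str, caps: Capabilities) -> str:
    """Collision avoidance is always allowed; extra maneuvers are gated."""
    if desired == "avoid_collision":
        return desired                 # identical safety behavior in both tiers
    if desired == "change_lane" and caps.lane_changes:
        return desired
    if desired == "take_off_ramp" and caps.off_ramps:
        return desired
    return "stay_in_lane"

print(choose_maneuver("change_lane", AUTOPILOT))  # stay_in_lane
print(choose_maneuver("change_lane", FSD))        # change_lane
```

Testing one gated stack also means every safety fix lands in both products at once, which is why an FSD release could lift Autopilot too.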

I am very curious to see first hand how much of a difference this upgrade makes (whenever it actually rolls out).
 

HaulingAss

Also, I've seen you comment previously about where Autopilot is approved for use, and it doesn't mesh with my understanding of the feature. This is the documentation I see: "Autosteer: Assists in steering within a clearly marked lane, and uses traffic-aware cruise control". No limits on undivided highways or in/out of cities.

My understanding, and IMO the most reasonable understanding, is that it is designed to be used any time you can activate the feature. If it were otherwise, what is the point of making it available when it's not an approved use? Seems like a recipe for confusion.
"Approved" might not have been the right word. In the Model 3's Owner's Manual it says this:

[Attached image: excerpt from the Model 3 Owner's Manual]


Of course I've been using it on all kinds of roads for three years but it does say it's intended for use on limited access highways.
 

Ogre

"Approved" might not have been the right word. In the Model 3's Owner's Manual it says this:

[Attached image: excerpt from the Model 3 Owner's Manual]


Of course I've been using it on all kinds of roads for three years but it does say it's intended for use on limited access highways.
Ahhh. Sort of like:

"We designed this hammer for driving and removing nails. If you find it useful for hitting other things, it is coincidental."

In other words don't complain if our roofing hammer is less than ideal for smashing nuts.
 
rr6013 (OP)

As a software developer, I can tell you test suites are common. A test suite covers any and all expected behaviors the program will encounter. When the input is as broad as *reality*, it's impossible to develop an exhaustive test suite. Instead you try to build tests around all expected inputs and perhaps the worst-case scenarios.

It's a common practice when you encounter a bug or edge case to add that to your test suite.

Every time you change your software and rebuild it, you run the test suite to ensure that all of your tests still pass. Tesla's FSD code isn't quite the same as normal software, but I know they use similar methodologies to build their current tools. They spoke about creating a full simulated world, comparable to a video game engine, so they could create and test scenarios and train the FSD neural nets.

Presumably every time a Tesla on Autopilot or FSD is involved in a collision, it gets added to their test suite.

I doubt FSD and Autopilot are going to be separate code bases (or in this case, neural networks). Autopilot will be a less capable version of FSD, equally capable of avoiding potential collisions but not able to change lanes, take off-ramps, etc. Having one code base makes things simpler and makes testing a lot easier. My feeling is that when "FSD 10" launches, Autopilot is going to get a big shot in the arm in terms of safety as well.

I am very curious to see first hand how much of a difference this upgrade makes (whenever it actually rolls out).
Stumbled into the AP/FSD code-fork black hole too! A high-level flyover view suggests the two code bases converge at roughly Level 5.

The deeper I dug, the more apparent it became that Tesla started AP down the incremental, iterative development roadmap and the automotive industry mostly followed them. But Tesla made a quick pivot to FSD.

It wasn't until Jalopnik came out with a rant that FSD got autonomous driving wrong that I uncovered a tell: the Jalopnik / Blue Cruise et al. AP-like iterative enhancement roadmap leads to non-FSD.

At the core of the auto industry's development paradigm, front and center, is "the driver"! As long as the driver is in control, in the loop, these cruise technologies can assist drivers. Drivers can get better and safer but never approach accident-free. The auto industry can approach Level 4. They can never jump to Level 5. They can only assert that it's impossible to attain.

Elon is unequivocal: remove the human, and human mistakes, and the technology approaches 10X (1000%) improved safety. That's Full Self-Driving (FSD). Tesla has its eye on the goal of safety, FSD and a 10X improvement.

The two driving code bases (AP + FSD) will never converge – ever. They serve separate purposes.