FSD Beta v9 v10 or any FSD Beta & children in car

firsttruck

Well-known member
Joined
Sep 25, 2020
Threads
177
Messages
2,575
Reaction score
4,103
Location
mx
Vehicles
none
Country flag
Something that has been bothering me for a while is FSD Beta drivers who make videos with children in the vehicle.

Even though the driver is monitoring the car and can take over, there is inherently a delay in response that, in a split-second situation, could make the difference between an accident and a near miss.

Another concern is that a child might require attention or distract the driver at exactly the moment FSD Beta makes a mistake, so the human monitoring the beta misses the clues or is too slow to react.


I have seen some videos with a father and a toddler (or infant), and other similar situations.
I stop watching any video where I find that children are passengers in a beta-driven car.

I consider driving a car on beta software with children aboard to be child endangerment.

Children also cannot give consent.

Another related situation is when both spouses (or both caregivers) of children ride in a beta-driven vehicle. Even if the children are not in the car, a serious accident would cost them all of their primary caregivers.

I think Tesla should explicitly warn owners not to have children in the car when FSD Beta is in use.

I want FSD to be tested and become a finished product as soon as possible.

From a business perspective, there is also the huge amount of bad publicity Tesla would get if an FSD Beta car carrying children had a bad accident.

 

JBee

Well-known member
First Name
JB
Joined
Nov 22, 2019
Threads
18
Messages
4,772
Reaction score
6,147
Location
Australia
Vehicles
Cybertruck
Occupation
. Professional Hobbyist
Country flag
Ethically I'd agree. There's also a danger to other cars and bystanders too. But to make a risk assessment I'd need to know just how prone FSD is to errors, what the consequences of those errors are, and how the controls work to mitigate them. It would be interesting to see the FSD risk matrix. I'm pretty sure the delays so far are because they are trying to reduce the risks. It's only in their best interest that they do.
 

ajdelange

Well-known member
First Name
A. J.
Joined
Dec 8, 2019
Threads
4
Messages
3,213
Reaction score
3,404
Location
Virginia/Quebec
Vehicles
Tesla X LR+, Lexus SUV, Toyota SR5, Toyota Landcruiser
Occupation
EE (Retired)
Country flag
Tesla's collected data have consistently shown that their cars with autopilot on are appreciably safer than those with it off. Therefore, driving in a Tesla equipped with autopilot with the autopilot off is child endangerment.

The delays are because it takes more time to get those extra 9s than one thinks. Musk made a reference to that fact in one of his presentations; Battery Day, I think it was.

As for the press: they say Tesla crashes are caused by autopilot even when it is off (recent case in Texas).
 
OP

firsttruck

Well-known member
Joined
Sep 25, 2020
Threads
177
Messages
2,575
Reaction score
4,103
Location
mx
Vehicles
none
Country flag
Tesla's collected data have consistently shown that their cars with autopilot on are appreciably safer than those with it off. Therefore, driving in a Tesla equipped with autopilot with the autopilot off is child endangerment.

Cruise control has been around for many, many decades.

Is Tesla Autopilot advertised as beta software?

My comment here is about Tesla FSD Betas.
 

CyberMoose

Well-known member
First Name
Jacob
Joined
Aug 19, 2020
Threads
1
Messages
820
Reaction score
1,415
Location
Canada
Vehicles
Model 3
Country flag
It's possible that a child will distract the driver at a time when FSD makes a mistake. It's also possible that a child will distract the driver at a time when another driver on the road makes a mistake.

I've seen a lot of FSD Beta videos, and while I haven't actually seen any with a child in the vehicle with the beta, I think the beta is extremely cautious; I don't recall many examples where it might have gotten into a collision at more than 5-10 mph.
What I do know is that whether it's beta FSD, FSD, Autopilot, or just driving a Tesla at all, those cars seem safer than any other. I've seen videos of Teslas avoiding collisions at high speed, fishtailing a little, and then recovering perfectly. I've seen tons of videos of non-Teslas in similar situations that either end up in the collision or can't recover after avoiding it.


I don't really know what information the closed beta group gets, whether there are rules to follow like not having kids in the car when the beta is engaged, or whether they can drive it like any other car. But I think if someone is being responsible, there shouldn't be any increased risk. If someone is making some stupid YouTube video where they aren't paying attention, that's obviously a completely different story. But common, limited driver distractions, where the driver looks away for a second or two, shouldn't be any more dangerous, since the software gets tested before going to the closed or open beta group. Tesla could likewise release a new version without releasing a beta at all, and I wouldn't consider it unsafe, because Tesla has probably tested it for safety more than anything else.
 


JBee

Well-known member
First Name
JB
Joined
Nov 22, 2019
Threads
18
Messages
4,772
Reaction score
6,147
Location
Australia
Vehicles
Cybertruck
Occupation
. Professional Hobbyist
Country flag
So there are various perspectives on the matter, I suppose. It is a larger issue than just FSD, as there are cascading system dependencies.

After thinking about it for a while, I believe it will come down to responsibilities and reactions at the boundaries between the systems of law, risk assessment, morals, technical competence, driver capability, and vehicle risks.

Let's start from the top. Laws are slow. Typically they only come about through mass outcry, obvious abuse of common rights, or monopolies fortifying their positions (aka big tech). In fact, much law (especially government law) revolves around corporate operations, as financial means and incentives dictate much of what companies and persons (corporate, not natural, persons) do.

So on Tesla's side there are heaps of responsibilities, but the two main ones are brand damage (= sales loss) and negligence. I doubt Tesla would be reckless enough to intentionally pursue shortcuts that carry risk. Negligence is one of those things that typically only raises its head after an event, and is best exposed in hindsight.

This is why it's important for them to reduce risk (the risk assessment layer) by:
  1. limiting risk exposure by keeping the beta sample group small;
  2. ensuring beta testers are fully aware of the risks;
  3. ensuring testers are competent and skilled enough to apply a control, should a risk occur, to reduce the consequence; and
  4. adding features progressively until they are deemed stable, so that risk is only added incrementally, piece by piece.
To limit exposure further, I'd expect that beta and FSD users have entered into a contract indemnifying Tesla against any potential liabilities arising from the use of FSD before they can use it. This is common practice, but it does limit the manufacturer's exposure and leaves a higher proportion of responsibility on the user.

This leads us across the next boundary layer to morals, and the responsibility of users to make the correct decision. Making a "reasonable" decision, though, requires a person to have reasonable knowledge about the decision they are trying to make. This is a very contentious subject to navigate in law without adding some common sense and morals to the mix. There is no doubt we have enough ignorant and/or irresponsible people, careless to boot.

Accordingly, it has become hard to draw a line where the boundary of responsibility lies, which is what fills the courts with cases to decide in lieu of any direct authority.

Because of that, I can only really offer my opinion on the situation below; there are simply too many variables to give an answer with any high likelihood of being correct. Law and courts aren't immune from appeals either; corruption also rears its head, and humans are fallible too.

My personal belief is that parents should be responsible caregivers to their dependents, under the proviso that governments are there to protect our rights (not to give them), which in turn means the burden lies with the caregiver, not the government, to act in an appropriate, responsible manner. Law should be the minimum standard we live by, failing which we are penalized and removed from society (another contentious issue I won't get into now). The point is we should not be bouncing off the minimum standard as the "accepted behavior", but should endeavor to build a society that is better than that.

This all gets pretty muddy in society, though, as different opinions (including mine) are often swayed by mob rule and popularity contests, with media bias propagating whatever "truths" generate user interaction and commercial returns. In the end we do not have a singular, individual consciousness; our minds are shaped by our intellectual inheritance. Sometimes that inheritance is a hindrance and not a help (hence first-principles thinking, trying to navigate to the root cause).

This comes back to the caregiver's responsibility and their capacity to instill appropriate values in their dependents. The smallest unit of society is the parents, then family, then (not in order) relatives, friends, neighbors, communities (social, sport, education, online, etc.), states, and countries. As such, the scope of caregiver responsibilities and the minefield of risks is enormous.

That brings us to the technical layer. Without a substantive and up-to-date understanding of current FSD and vehicle technology, which I'd say 99% of the population doesn't have because they aren't in the industry, it would be difficult for any caregiver to make a "reasonable" decision on using FSD. This in turn puts a lot of pressure upstream on the manufacturer, in that they can't reasonably require caregivers to all be experts before they can drive with FSD. That could mean that if the manufacturer doesn't give enough warnings, probably backed by some sort of independent professional advice as to FSD's effects, it could be found negligent.

The positive part of the technical component is that these systems operate at frequencies faster than human reflexes, and typically with much better situational perception. Preventative measures such as radar cruise control, automatic collision detection and braking, and even steering to avoid impact when someone veers into your lane or is about to rear-end you all considerably help safety. The same goes for the post-collision systems: airbags, seat belts with force limiters and pretensioners, and impact-absorbing and deflecting structural components all operate outside the limited bounds of human perception, making human interaction with them meaningless.

And this, I suppose, is the next step, where FSD becomes an even more predictive and pre-emptive vehicle control method. I read once that if only 15% of vehicles had radar cruise control keeping the correct following distance in traffic, something like 70% of congested stop-and-go traffic (a waveform, surprisingly) would disappear. The reasoning here is that although these systems can sometimes fail, they are conversely always "on".

No human driver is always on, with 360-degree perception, lightning reflexes, and superhuman, strong-as-steel strength should it all go wrong. There are a multitude of risks on the road, to the point that an argument could be made that children should not be in a vehicle at all, FSD or not. Parents drink, smoke, do drugs, suffer from fatigue (often), are in a rush, have other problems on their mind, worry about the future or other tasks to do, etc. All of this happens in a vehicle moving at lethal velocities, and all of it distracts the driver from the primary task. And that doesn't include the things on the road that are mostly outside the carer's control: poor road maintenance, poor weather, poor visibility, vehicle condition, other road users (and abusers), or wildlife.

My personal opinion is that, for me, FSD will be an always-on assistant co-pilot, but in the end I will be the captain, and with that, the responsibility for whom I expose to those risks lies with me as caregiver.
 
Last edited:

JBee

Well-known member
First Name
JB
Joined
Nov 22, 2019
Threads
18
Messages
4,772
Reaction score
6,147
Location
Australia
Vehicles
Cybertruck
Occupation
. Professional Hobbyist
Country flag
The software as it exists is Beta. It is meant to be monitored.

How are children any more of a risk in this situation, where there are multiple systems/person monitoring versus only that one person?

-Crissa
I'd imagine that would be human driver complacency and, at times, poor assessment of and reaction to a real-time event. One of the issues I've seen on YouTube is that there is significant latency between an appropriate reaction by the FSD and a faulty one, leaving the beta driver trying to catch up to rectify it. Had the driver been in control, it would have been safely navigated from the start.

The FSD needs to show what it has chosen to do on the display well before it does it, so the driver has time to compensate.

Like here: [embedded video: an FSD Beta car approaching a poorly marked construction zone]
 

Crissa

Well-known member
First Name
Crissa
Joined
Jul 8, 2020
Threads
127
Messages
16,612
Reaction score
27,665
Location
Santa Cruz
Vehicles
2014 Zero S, 2013 Mazda 3
Country flag
Had the driver been in control, it would have been safely navigated from the start.
That's not how human drivers work.

-Crissa

PS, your example shows someone speeding through a construction zone that is very poorly marked. There should not be a 30 mph sign when you clearly need to stop in a hundred feet. And the beta saw the cones and warned the driver.
 

ajdelange

Well-known member
First Name
A. J.
Joined
Dec 8, 2019
Threads
4
Messages
3,213
Reaction score
3,404
Location
Virginia/Quebec
Vehicles
Tesla X LR+, Lexus SUV, Toyota SR5, Toyota Landcruiser
Occupation
EE (Retired)
Country flag
It's really quite simple. If the car's intrinsic safety level for a 20-mile trip, derived from Tesla's historical data, is 5 nines and you drive drunk, you take away a good part of one of those nines, and we all recognize that it is irresponsible to do that, whether you have children on board or not and whether you are drunk on beer or whisky. But Tesla's data also show that if autopilot is on, the car's safety for a 20-mile trip rises to 5.22 nines. If you don't turn it on, you deny passengers and bystanders 0.22 nines of security, and that's clearly irresponsible too, whether the software is alpha, beta, or release.

The autopilot the hoi polloi have is beta. If you are talking about what the privileged few have, I am not sure it is even considered beta yet. And, of course, Tesla wouldn't have enough historical data on it yet to make meaningful statements about its reliability. AFAIK there has not yet been an accident using it, so if the testers have logged a million miles on it (100 drivers at 10,000 miles per year for 1 year), all we can say is that it appears at this point to be better than 4.7 nines.
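
For anyone who wants to play with these figures, here is a minimal sketch of the "nines" arithmetic in Python (the miles-per-accident inputs are illustrative assumptions, not Tesla's published numbers):

Code:
import math

def trip_nines(miles_per_accident: float, trip_miles: float = 20.0) -> float:
    """Safety 'nines' for one trip: -log10 of the probability of at least
    one accident, assuming a constant per-mile accident probability."""
    p_mile = 1.0 / miles_per_accident
    p_trip = 1.0 - (1.0 - p_mile) ** trip_miles
    return -math.log10(p_trip)

# Illustrative inputs only (assumptions, not Tesla's published data):
print(trip_nines(2_000_000))  # ~5.00 nines: one accident per 2M miles
print(trip_nines(3_300_000))  # ~5.22 nines, roughly the autopilot-on figure cited above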
 


OP

firsttruck

Well-known member
Joined
Sep 25, 2020
Threads
177
Messages
2,575
Reaction score
4,103
Location
mx
Vehicles
none
Country flag
It's really quite simple. If the car's intrinsic safety level for a 20-mile trip, derived from Tesla's historical data, is 5 nines and you drive drunk, you take away a good part of one of those nines, and we all recognize that it is irresponsible to do that, whether you have children on board or not and whether you are drunk on beer or whisky. But Tesla's data also show that if autopilot is on, the car's safety for a 20-mile trip rises to 5.22 nines. If you don't turn it on, you deny passengers and bystanders 0.22 nines of security, and that's clearly irresponsible too, whether the software is alpha, beta, or release.
Again you bring up "Tesla Autopilot". I already corrected you once.

The title of this thread and my original post concern FSD Beta, NOT "Tesla Autopilot".

Right now "Tesla Autopilot" is not for city streets & intersections.

Does Tesla even claim that "Tesla Autopilot" can do lane changes on the highway?


Tesla > Support - Autopilot and Full Self-Driving Capability
https://www.tesla.com/support/autopilot

Autopilot
Traffic-Aware Cruise Control: Matches the speed of your car to that of the surrounding traffic
Autosteer: Assists in steering within a clearly marked lane, and uses traffic-aware cruise control


----------
The autopilot the hoi polloi have is beta. If you are talking about what the privileged few have, I am not sure it is even considered beta yet. And, of course, Tesla wouldn't have enough historical data on it yet to make meaningful statements about its reliability.
----------

Really!!!!!!

Where is the evidence of how many Tesla customer FSD BETA drivers actually log even 2,000 miles in six months in FSD BETA mode (NOT Autopilot), with most of those miles being uninterrupted travel through city-street intersections: making unprotected right/left turns, changing lanes on city streets, and obeying all city-street traffic signals and signs?
That would be multiple hours a day, five days per week, for months. I have not seen any Tesla customer using FSD BETA mode even claim to be able to use it that much.

FSD BETA mode NOT Autopilot

AFAIK there has not yet been an accident using it, so if the testers have logged a million miles on it (100 drivers at 10,000 miles per year for 1 year), all we can say is that it appears at this point to be better than 4.7 nines.
I do not know about actual accidents, but I have seen enough YouTube videos of close calls that could have been serious accidents.

There is no way it is even 3 nines.
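
For scale, here is what 3 nines would mean under the 20-mile-trip convention used elsewhere in this thread (a rough sketch; the trip length is that assumption, not data):

Code:
# 3 nines -> accident probability of 10**-3 per trip,
# i.e. one accident per 1,000 twenty-mile trips:
trip_miles = 20
p_trip = 10 ** -3
print(trip_miles / p_trip)  # 20000.0 -> one accident per ~20,000 miles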
 
Last edited:
OP

firsttruck

Well-known member
Joined
Sep 25, 2020
Threads
177
Messages
2,575
Reaction score
4,103
Location
mx
Vehicles
none
Country flag
Tesla Support - Autopilot and Full Self-Driving Capability
https://www.tesla.com/support/autopilot

------------------

Autopilot
Traffic-Aware Cruise Control: Matches the speed of your car to that of the surrounding traffic
Autosteer: Assists in steering within a clearly marked lane, and uses traffic-aware cruise control

------------------

Full Self-Driving Capability

Navigate on Autopilot (Beta): Actively guides your car from a highway’s on-ramp to off-ramp, including suggesting lane changes, navigating interchanges, automatically engaging the turn signal and taking the correct exit

Traffic and Stop Sign Control (Beta): Identifies stop signs and traffic lights and automatically slows your car to a stop on approach, with your active supervision

Upcoming: Autosteer on city streets

Auto Lane Change: Assists in moving to an adjacent lane on the highway when Autosteer is engaged

Autopark: Helps automatically parallel or perpendicular park your car, with a single touch

Summon: Moves your car in and out of a tight space using the mobile app or key

Smart Summon: Your car will navigate more complex environments and parking spaces, maneuvering around objects as necessary to come find you in a parking lot.

------------------
 

JBee

Well-known member
First Name
JB
Joined
Nov 22, 2019
Threads
18
Messages
4,772
Reaction score
6,147
Location
Australia
Vehicles
Cybertruck
Occupation
. Professional Hobbyist
Country flag
Tesla Support - Autopilot and Full Self-Driving Capability
https://www.tesla.com/support/autopilot

------------------

Autopilot
Traffic-Aware Cruise Control: Matches the speed of your car to that of the surrounding traffic
Autosteer: Assists in steering within a clearly marked lane, and uses traffic-aware cruise control

------------------

Full Self-Driving Capability

Navigate on Autopilot (Beta): Actively guides your car from a highway’s on-ramp to off-ramp, including suggesting lane changes, navigating interchanges, automatically engaging the turn signal and taking the correct exit

Traffic and Stop Sign Control (Beta): Identifies stop signs and traffic lights and automatically slows your car to a stop on approach, with your active supervision

Upcoming: Autosteer on city streets

Auto Lane Change: Assists in moving to an adjacent lane on the highway when Autosteer is engaged

Autopark: Helps automatically parallel or perpendicular park your car, with a single touch

Summon: Moves your car in and out of a tight space using the mobile app or key

Smart Summon: Your car will navigate more complex environments and parking spaces, maneuvering around objects as necessary to come find you in a parking lot.

------------------
I think you have demonstrated the difference between FSD and Autopilot.

Do you have any more to add to the other comments about it being questionable to let children ride with beta testers?
 

ajdelange

Well-known member
First Name
A. J.
Joined
Dec 8, 2019
Threads
4
Messages
3,213
Reaction score
3,404
Location
Virginia/Quebec
Vehicles
Tesla X LR+, Lexus SUV, Toyota SR5, Toyota Landcruiser
Occupation
EE (Retired)
Country flag
Again you bring up "Tesla Autopilot". I already corrected you once.

The title of this thread and my original post concern FSD Beta, NOT "Tesla Autopilot".
If you had the technical depth you would understand that it doesn't matter which we are talking about. There have been no accidents with the newer system. Therefore, the best estimate of its reliability is "better than 4.7 nines" (assuming they have a million miles on it and we are talking about 20-mile trips).

I do not know about actual accidents but I have seen enough Youtube videos of close calls for serious accidents.
When you do, come back and we'll resume the conversation. You base your conclusions on YouTube? No further comment on this that could be considered kind is possible. I will point out that I have, while actually driving a Tesla with its beta autopilot, taken it out of autopilot many times to avoid what I thought might turn into an accident, or at least an incident. Those events don't count as accidents. Should I not use autopilot when there are passengers aboard? Exactly the same reasoning applies. The only difference is that, as there have been accidents with autopilot, we can compute a reliability number (5.22 nines, which is, of course, an estimate based on the current history), but as there haven't been any for FSD v9 beta we can only bound it at better than 4.7 nines (based on my guess as to how many miles the beta testers have logged, and on 20-mile trips; I'm guessing there are at least 100 beta testers, that they, like most of us, drive 10,000 miles per year, and that it has been out since last Nov(?)). If you don't like a million miles or 20-mile trips, use whatever numbers you like in
-log10(1 - (1 - 1/M)^T) ≈ -log10(T/M)

where M is the number of accident-free miles logged and T is the trip length in miles, e.g. -log10(1 - (1 - 1/500000)^20) = 4.39795 for 500,000 miles driven in 20-mile trips.

From the formula you can determine how many miles the beta testers have to log to demonstrate that it is safer than not using it: -log10(1 - (1 - 1/2e6)^20) = 5.00, so once they have logged 2,000,000 accident-free miles that is demonstrated. Perhaps that's the point at which they will release it to the rest of us.
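
A quick way to check those numbers (a minimal sketch of the formula above in Python; the mileage figures are the guesses stated earlier, not measured data):

Code:
import math

def nines_bound(miles_logged: float, trip_miles: float = 20.0) -> float:
    """Best-case 'nines' after miles_logged accident-free miles:
    -log10(1 - (1 - 1/M)^T), per the formula above."""
    return -math.log10(1.0 - (1.0 - 1.0 / miles_logged) ** trip_miles)

print(nines_bound(500_000))    # 4.39795..., matching the example above
print(nines_bound(1_000_000))  # ~4.7, the bound for a million logged miles
print(nines_bound(2_000_000))  # ~5.00, the suggested release threshold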

It is indeed quite simple if you have a fundamental understanding of what these numbers mean.
 
Last edited:

Jamez

New member
First Name
James
Joined
Dec 9, 2020
Threads
0
Messages
4
Reaction score
0
Location
Kitchener Ontario
Vehicles
CYBERTRUCK
Occupation
Taper of Dry Wall (retired)
Country flag
Something that has been bothering me for a while is FSD Beta drivers who make videos with children in the vehicle.

Even though the driver is monitoring the car and can take over, there is inherently a delay in response that, in a split-second situation, could make the difference between an accident and a near miss.

Another concern is that a child might require attention or distract the driver at exactly the moment FSD Beta makes a mistake, so the human monitoring the beta misses the clues or is too slow to react.


I have seen some videos with a father and a toddler (or infant), and other similar situations.
I stop watching any video where I find that children are passengers in a beta-driven car.

I consider driving a car on beta software with children aboard to be child endangerment.

Children also cannot give consent.

Another related situation is when both spouses (or both caregivers) of children ride in a beta-driven vehicle. Even if the children are not in the car, a serious accident would cost them all of their primary caregivers.

I think Tesla should explicitly warn owners not to have children in the car when FSD Beta is in use.

I want FSD to be tested and become a finished product as soon as possible.

From a business perspective, there is also the huge amount of bad publicity Tesla would get if an FSD Beta car carrying children had a bad accident.
Re: "I consider": what's your point? That you care more for children than others do? You treat real and imagined events as if they were the same thing: potential "if" events, maybe-"if" events, versus real events. If this or that happened at just the right time, something might maybe happen, and then you could say "I told you so." Get a real life, and dismount that high horse, even if the ride makes you look good. But you won't, because you'd rather not.