JALOPNIK: Why A Tesla Using FSD Running A Stop Sign Isn't Necessarily Terrible

FutureBoy
Why A Tesla Using FSD Running A Stop Sign Isn't Necessarily Terrible
I mean, it's not great, but it's a good reminder of the incredible cultural complexity of driving
By Jason Torchinsky


Illustration: Jason Torchinsky

Videos showing people using the latest version of Tesla’s still misleadingly named Full Self Driving (FSD) Beta v10 have been showing up on the internet lately, and as seems to be the case with every release, there’s a pretty varied mix of impressively competent, even mundane drives, and alarmingly bad ones where the car seems confused and hesitant and disengages constantly.

Here, to be fair, I’ll give examples of each. Here’s one where things go pretty well, in San Francisco:

[Embedded video: FSD Beta v10 drive in San Francisco]
There’s a lot to be impressed by there; it’s not perfect, but it’s doing some complex and impressive things, even seemingly deciding to take some initiative on a right-turn-on-red, which is a pretty sophisticated move.

On the other side of the spectrum is this unedited video of a drive in excellent weather and visibility conditions, in a city with, it appears, less traffic than what was seen in the San Francisco video above, and yet FSD handled driving here with all of the aplomb, skill, and confidence of a ferret that’s just been handed an iPad and asked to look up something on Wikipedia.

[Embedded video: unedited FSD Beta drive]
It’s not good. Even if we average out the performance of FSD between these two videos, the end result is not something that is remotely close to anything like “full self driving,” no matter what its name claims.

It’s impressive in many ways, and a fascinating work in progress, but it’s not done, and here it is, deployed on public roads via 4,500 pounds of mobile computing hardware, surrounded by people who definitely did not opt into this testing.

Setting ethics and repercussions of public testing aside for the moment, seeing the results of these tests is interesting. One example that caught my attention was in this tweeted video of a short nighttime FSD drive:

[Embedded tweet: short nighttime FSD drive]
What I find interesting here happens about nine seconds into the video, when the Tesla approaches a stop sign, and proceeds to roll through it at about four mph.

While we’ve very likely all done this exact same maneuver many, many times, especially on similarly empty nighttime streets, it is technically illegal. That Tesla ran a stop sign. It’s a robot, not a people, and as such shouldn’t be susceptible to the base, reprobate urges that push us into minor crimes, like stop-sign-roll-through or sneakily and messily devouring the contents of a Pringles can while hiding your head in the milk fridge at a grocery store.

But it did commit that minor infraction, and while many are calling this out as a problem, I’m not so sure I think this sort of action from an automated driving system is such a bad thing, because the world is very complicated.

I don’t know if there’s any algorithm in Tesla’s FSD stack with some sort of IF-THEN conditional that takes into account IF nighttime AND IF no traffic AND IF stop sign THEN roll through at low speed, but if one did exist, I don’t think I’d necessarily consider it a problem.
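Just to make that concrete, here’s a toy sketch of the kind of rule I mean, in Python, with every name and number invented by me for illustration; this is emphatically not Tesla’s actual code:

Code:
from dataclasses import dataclass

@dataclass
class Approach:
    """Hypothetical snapshot of conditions while nearing a stop sign."""
    is_night: bool
    cross_traffic: bool
    pedestrians_nearby: bool

def stop_sign_target_speed_mph(a: Approach) -> float:
    """Toy rule invented for this article, not Tesla's actual logic:
    a full stop by default, and the slow roll seen in the video only
    on an empty street at night."""
    if a.is_night and not a.cross_traffic and not a.pedestrians_nearby:
        return 4.0  # roll through at about walking pace
    return 0.0  # a complete, legal stop

The IF-THEN part is trivial; the hard part is filling in those booleans with the confidence of a human glancing around an empty intersection.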

I say this because human driving is complex and nuanced, and there are times when following the letter of the law is not the best choice for a given situation.

For example, there are the traffic laws that are written on the books, and then there are the traffic laws as they are actually practiced. I covered this a good bit in my book, so I’ll just excerpt that here instead of re-writing it:

Making things even more difficult is the fact that these unwritten rules are extremely regional, and every major metropolis seems to have its own dialect of driving and its own set of unwritten rules. In Los Angeles, for example, there is an extremely consistent and rigid unwritten rule about turning left at an unprotected traffic light. That is, a left turn at an intersection with no provision for a green arrow traffic signal.
The Los Angeles rule is that when the light goes from green to yellow to red, up to three cars waiting to turn may turn on the red light. I lived in Los Angeles for over 17 years, and this rule was one of the most consistent things in my life there. Every Los Angeleno seemed to know about the three-cars-on-a-red rule, and when I described it to anyone else in the country, they looked at me like I was an idiot. And not the usual kind of idiot I’m assumed to be: a dangerous idiot.
Should a robotic vehicle follow this unwritten LA traffic rule? It’s technically illegal, but in practice it’s the norm, and not acknowledging the rule could potentially create more issues than just going with it would. I know if I was in the middle of an intersection when the light went red and some stupid robo-car in front of me refused to make the turn, it’d drive me batshit. I don’t think I’m the only one.
Ignoring the three-cars-on-a-red rule in LA would make human drivers hate automated cars, and would cause more traffic problems. Same goes for cars in big cities with lots of pedestrian traffic, like New York or Mexico City, for example, where drivers often have to edge slowly into busy crosswalks just to be able to demonstrate an intent to actually move; a totally stationary car will be stuck there forever, as the masses of traffic-jaded pedestrians will just keep walking past.

Pushing into the crosswalk while there are people walking there is technically not exactly legal, and yet it’s a crucial part of the dance that keeps traffic flowing.

Once you start thinking about this, there are so many examples: crossing a double yellow to give room to a broken-down car or a cyclist on a narrow road, avoiding an obstacle by driving on the shoulder or into a bike or bus lane, speeding up on a yellow instead of slowing to avoid a hard braking situation at a stoplight, and so on.

None of those are necessarily ideal and all are technically illegal to some degree, but the results those actions provide are better than the outcomes of attempting to follow the law to the letter.

Really, the ability to understand when rule-breaking makes sense is a good example of the top-down reasoning vs. bottom-up reasoning that makes the problems of self-driving so difficult.

Absurdly simplified, this concept notes the difference between how humans drive (top-down, meaning we start with an overall understanding of the entire context of the environment we drive in) and how computers drive (bottom-up, reacting to sensor input and rules without really understanding the overall situation).

This is one of the hardest obstacles for self-driving car tech to overcome. It’s not just about sensors and neural networks and powerful computers—we have to figure out ways to synthesize our cultural knowledge surrounding driving, which is a big deal.

The good news is that I think there are companies actively thinking about this. I recently met with some people from Argo AI, the ones using a disguised pre-production Volkswagen ID Buzz for their testing. I’ll have a bigger article on them soon, but for the moment, here’s a teaser pic:

[Photo: Argo AI’s disguised pre-production Volkswagen ID Buzz test vehicle]

Photo: Jason Torchinsky

Their approach to automated driving is quite different from Tesla’s, which, again, I’ll get into soon, but the key thing that came up in our conversation, the one that convinced me these harder-to-define issues are at least being considered, was one word: Halloween.

The Argo engineers understood that there are times when, for reasons that have absolutely nothing to do with driving, all the rules change. During Halloween, kids won’t necessarily look like kids, and they’ll be moving all over the roads, at night, in patterns and paths that do not happen any other time of year.

Whatever an AI thinks it understands about pedestrian behavior does not apply during the candy-fueled madness of Halloween. The engineers I spoke with understood this and treated it as a valid problem in need of some sort of solution.

Would they special-case October 31st and have the car operate under an entirely different set of rules? Would the car’s speed caps be much stricter on that night? Would the lidar or camera setups operate differently?
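The calendar check, at least, is trivial to imagine. Purely as a toy sketch, with numbers I made up and nothing Argo actually described to me:

Code:
from datetime import date

# Invented values, purely for illustration; nobody's real parameters.
NORMAL_RESIDENTIAL_CAP_MPH = 25.0
HALLOWEEN_RESIDENTIAL_CAP_MPH = 15.0

def residential_speed_cap_mph(today: date) -> float:
    """Drive more conservatively on Halloween night, when the usual
    assumptions about pedestrian behavior stop applying."""
    if (today.month, today.day) == (10, 31):
        return HALLOWEEN_RESIDENTIAL_CAP_MPH
    return NORMAL_RESIDENTIAL_CAP_MPH

Everything that isn’t the date is the real problem: deciding which perception and prediction assumptions need to loosen, and by how much.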

I don’t know, but I do know that Halloween is just one of many, many bits of glorious chaos that makes human life so wonderful, and such a hell for machines to understand.

But it’s our job to make these machines understand, even if that means, sometimes, breaking some of the normal rules of the road.

None of this is easy, and it’s good to remember that.
FutureBoy (OP)
I initially started reading this article for the FSD videos. But I like the point about Halloween. I'll be curious to see how Tesla FSD handles Halloween and other interesting holidays. For instance, how does FSD handle coming up on a parade? Does it wait for the parade to pass? Or try to slowly shuffle through the middle?

Once I was driving through Montana on I-90 in the middle of the night going over a high mountain pass in thick fog. Suddenly as I rounded a corner in the middle lane of the 3 lanes, there were 2 hitchhikers with large backpacks standing in the right-hand lane trying to get a ride. They looked like ghosts. I nearly had a heart attack. But hey, if I had FSD at that time, would FSD have stopped for them?

So many strange situations happen in real life. I'm curious how FSD will come to handle all the strange edge cases.

Sirfun
I wonder if the algorithm has to do with it being a 4-way stop: with no cars anywhere near the intersection, there's obviously no need to stop.
That would be an interesting conversation with the cop sitting there waiting to write tickets for that very infraction. Hopefully the Tesla engineers have the cameras identify police vehicles and adjust the driving style. :ROFLMAO:

BTW, about 40 years ago I got a ticket along these lines. I was riding my motorcycle up to a 4-way stop with no cars approaching, doing a slow roll without putting my foot down to stop, when I noticed a parked cop car 100 feet up the street to my left. I immediately stopped, and he started rolling. I sat there and waited for him at the stop sign, and he motioned for me to continue and turned on his red light. I pulled over, and he proceeded to tell me how he was going to write me a ticket for failing to stop behind the limit line. I WAS PISSED! I explained how ridiculous it was that my front tire was all that had crossed the line. But he had an agenda.
FutureBoy (OP)
Sirfun said:
I wonder if the algorithm has to do with it being a 4-way stop: with no cars anywhere near the intersection, there's obviously no need to stop.
That would be an interesting conversation with the cop sitting there waiting to write tickets for that very infraction. Hopefully the Tesla engineers have the cameras identify police vehicles and adjust the driving style. :ROFLMAO:

BTW, about 40 years ago I got a ticket along these lines. I was riding my motorcycle up to a 4-way stop with no cars approaching, doing a slow roll without putting my foot down to stop, when I noticed a parked cop car 100 feet up the street to my left. I immediately stopped, and he started rolling. I sat there and waited for him at the stop sign, and he motioned for me to continue and turned on his red light. I pulled over, and he proceeded to tell me how he was going to write me a ticket for failing to stop behind the limit line. I WAS PISSED! I explained how ridiculous it was that my front tire was all that had crossed the line. But he had an agenda.
I find that once you have come to the attention of law enforcement, there is very little if anything that can be done to avoid getting into some sort of negative situation. Sure, there are the crying beauties that get out of tickets. But really, even that is a negative situation if only for the need to clean up the mascara afterward.

And there are times when you are just going to get stopped no matter what. For instance, I was recently (pre-covid though) at a large outdoor music festival in a remote town. The town had brought in law enforcement support from many towns around to help for the festival days. Each night as the festival wound down, it was nearly impossible to drive away from the festival toward one's hotel without getting stopped somewhere along the line. Nearly every intersection out of town (most hotels were in neighboring towns) had 1 or more patrol cars sitting in the dark looking for someone to pull over. I couldn't even count the number of people I personally saw pulled over.

Ogre
FutureBoy said:
For instance, how does FSD handle coming up on a parade? Does it wait for the parade to pass? Or try to slowly shuffle through the middle?
Presumably it will route around a parade before it gets there due to the road closure. Otherwise, it's a serious routing failure.

An impromptu protest? Who knows.
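To be concrete about what "route around" means mechanically, here's just a toy sketch of the textbook approach (not anything Tesla has published): drop the closed segments from the road graph before searching for a path.

Code:
import heapq

def route_cost_avoiding(graph, start, goal, closed):
    """Dijkstra over a road graph, skipping closed segments.
    graph: {node: [(neighbor, cost), ...]}; closed: set of (a, b) edges.
    Returns the cost of the best open route, or None if boxed in."""
    queue, visited = [(0.0, start)], set()
    while queue:
        cost, node = heapq.heappop(queue)
        if node == goal:
            return cost
        if node in visited:
            continue
        visited.add(node)
        for nxt, step in graph.get(node, []):
            if (node, nxt) not in closed:  # parade route drops out here
                heapq.heappush(queue, (cost + step, nxt))
    return None  # no open path at all: the serious routing failure case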

This is one of the reasons I don't think Level 5 is attainable for a long, long time. There's enough weird randomness in the world that a robot with a single-minded purpose *get from point A to point B* is going to fail. We might linger at something like Level 4++++ for decades, but these edge cases will exist for a long time.

I think we've all at some point been routed poorly and ended up having to do some creative re-routing, or even backing up down a road to get out of a sticky situation. FSD isn't designed to handle that kind of decision tree. "Make a technically illegal U-turn to avoid a flooded street" is not in the lexicon of robots. "Back out of the one-way alley you're on because it's completely blocked" is also not in there.

When Musk talks about getting rid of the steering wheel, I'm a bit perplexed. How do you get a "FSD" car with no steering control out of some of these situations where it's clearly not possible to proceed normally? Even if it's a little pop-out Nintendo-style controller, you gotta have some kind of abort sequence.


Bill906
Ogre said:
When Musk talks about getting rid of the steering wheel, I'm a bit perplexed. How do you get a "FSD" car with no steering control out of some of these situations where it's clearly not possible to proceed normally? Even if it's a little pop-out Nintendo-style controller, you gotta have some kind of abort sequence.
I'm not saying you are right or wrong, just commenting that after reading your post, I have this picture in my head of a Teslabot dancing and singing "Anything you can do I can do better..."

Crissa
He's wrong about the three-cars thing. It's 'cars that are already in the intersection,' and that's true everywhere. No car should cross the limit line after the yellow, but still may, and none may after the light turns red.

The size of the intersections means that's a car in the crosswalk, a car crossing the lane, and a car at the median, all waiting for traffic to clear.

He's a walking example of how easy it is to learn the wrong lesson.

The Halloween example is simple: a high ratio of kids to adults in a gaggle means there's an event, so be careful. AI Day showed them simulating that sort of thing, with people and dogs on a freeway.

-Crissa

PS, I usually have to put my front wheel on or over the stop-line to put my motor in the middle of the cycle sensor to get lights to see my motorcycle!

firsttruck
Ogre said:
When Musk talks about getting rid of the steering wheel, I'm a bit perplexed. How do you get a "FSD" car with no steering control out of some of these situations where it's clearly not possible to proceed normally? Even if it's a little pop-out Nintendo-style controller, you gotta have some kind of abort sequence.
Yup, joystick in car or control by remote operator at Tesla mothership
 

CyberGus

Well-known member
First Name
Gus
Joined
May 22, 2021
Threads
67
Messages
5,810
Reaction score
19,084
Location
Austin, TX
Website
www.timeanddate.com
Vehicles
1981 DeLorean, 2024 Cybertruck
Occupation
IT Specialist
Country flag
A man was riding in a cab, when the driver floored it through a red light. "OMG are you crazy?" said the man, to which the driver said "Hey, my cousin does it all the time, it's fine."

Later, they came upon a green light, and the cab came to a full stop. "What are you waiting for?" inquired the man. "Keep going."

"Are you crazy?" said the driver. "My cousin could be coming from the other direction!"
 

Ogre
firsttruck said:
Yup, joystick in car or control by remote operator at Tesla mothership
I would vastly prefer a yoke/steering wheel that retracts and can be popped out quickly. If the police get on my tail, I'm not going to have time to wait for Tesla support to answer the phone.