Why Running a Stop Sign Using a Tesla FSD Isn’t Always Terrible


Illustration: Jason Torchinsky

Videos showing people using the latest version of Tesla's misleadingly-named Full Self-Driving (FSD) Beta v10 have been appearing on the internet recently, and as has been the case with every release, they show a very diverse mix of driving: impressively capable, mundane, and dangerously bad, where the car gets confused and hesitant and just falls apart.

To be fair, I'll give examples of each. Here's one where things go pretty well in San Francisco:

There's a lot to be impressed by here; it's not perfect, but it's doing some complicated and impressive things, even deciding to take some initiative on a right-turn-on-red, which is a pretty sophisticated move.

On the other side of the spectrum is this unedited video of a drive in excellent weather and visibility conditions, in what appears to be the same city as the San Francisco video above, and yet here the FSD handles the driving with all the confidence and skill of a ferret that has just been handed an iPad and asked to drive.

That's not great. Even if we average out the FSD's performance between these two videos, the end result isn't anything close to "full self-driving," whatever the name implies.

It's impressive in many ways, and a fascinating work in progress, but that's what it is: a work in progress, and here it is deployed on public roads in the form of 4,500 pounds of mobile computing hardware, surrounded by people who never agreed to be part of this test.

Setting the ethics and side effects of public testing aside for the time being, it is interesting to see the results of these trials. One example that caught my attention was this tweeted video of an FSD drive at night:

What I find interesting here happens about nine seconds into the video, when the Tesla approaches a stop sign and proceeds to roll through it at about four mph.

While nearly all of us have very likely performed this exact same maneuver many, many times, especially on similarly empty nighttime streets, it is technically illegal. That Tesla ran a stop sign. It is a robot, not a person, and thus shouldn't be susceptible to the lazy, impatient urges that push us into small crimes like stop-sign roll-throughs, or stealthily and messily devouring the contents of a Pringles can at the grocery store while hiding our heads in the milk fridge.

But it did commit that minor infraction, and while many are calling it a problem, I'm not sure I think that kind of action from an automated driving system is such a bad thing, because the world is complicated.

I don't know if Tesla's FSD stack has some kind of IF-THEN conditional that takes into account the time of night and rolls through at low speed when there's no traffic at a stop sign, but if it does exist, I don't think I would necessarily consider it a problem.
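To be clear about what that hypothetical conditional might look like, here's a minimal sketch. This is purely illustrative; none of these names or thresholds come from Tesla's actual FSD software, which isn't public.

```python
# Purely hypothetical sketch of the kind of conditional described above.
# Function name, inputs, and speed values are all made up for illustration.

def stop_sign_target_speed(is_night: bool,
                           cross_traffic_detected: bool,
                           pedestrians_detected: bool) -> float:
    """Return a target speed in mph when approaching a stop sign."""
    if is_night and not cross_traffic_detected and not pedestrians_detected:
        return 4.0   # roll through slowly, as in the video
    return 0.0       # otherwise, come to a complete stop
```

The interesting question isn't whether such a rule is hard to write (it obviously isn't); it's whether a carmaker should deliberately encode a technically illegal behavior at all.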

I say this because human driving is complex and finicky, and at times following the letter of the law is not the best option for a given situation.

For example, there are traffic laws as written in the books, and then there are traffic laws as actually practiced. I covered this in my book, so instead of rewriting it I'll just excerpt it here:

Making things even more difficult is the fact that these unwritten rules are extremely regional, and every major metropolis seems to have its own dialect of driving and its own set of unwritten rules. For example, in Los Angeles, there is an extremely consistent and rigid unwritten rule about unprotected left turns at a traffic light; that is, a left turn at an intersection without a green-arrow traffic signal.

The unwritten law of Los Angeles is that when a light turns from green to yellow to red, three cars get to turn left on the red. I lived in Los Angeles for over 17 years, and this rule was one of the most consistent things in my life there. Every Angeleno seemed to know about the three-cars-on-a-red rule, and when I described it to people from elsewhere in the country, they looked at me like I was an idiot. And not the usual kind of idiot I'm used to being taken for; a dangerous idiot.

Should a robotic vehicle follow this unwritten LA traffic rule? It's technically illegal, but in practice it's the norm, and not following it could potentially cause more problems than going along with it. I know that if I were stuck in the middle of an intersection when the light turned red and some stupid robo-car in front of me refused to turn, it would drive me crazy. I don't think I'd be alone.

Ignoring the three-cars-on-a-red rule in LA would make human drivers hate automated cars, and would cause more traffic problems. The same goes for cars in large cities like New York or Mexico City, where drivers often have to creep slowly into busy crosswalks to demonstrate an intention to actually move; a perfectly stationary car would be stuck there forever, as the hordes of pedestrians, unvexed by traffic, would just keep moving.

Pushing into the crosswalk while there are people walking there isn't exactly legal, and yet it's an important part of the dance that keeps traffic flowing.

Once you start thinking about it, there are so many examples: crossing a double yellow to give a broken-down car or a cyclist space on a narrow road, avoiding an obstacle by driving on the shoulder or in a bike or bus lane, accelerating through a yellow light instead of slowing down to avoid a hard-braking situation at a stoplight, and so on.

None of these is necessarily ideal, and all are technically illegal to some degree, but the consequences of these actions are better than the results of attempting to obey the law to the letter.

In fact, the ability to understand when breaking the rules makes sense is a good example of top-down reasoning versus bottom-up reasoning, which is part of what makes self-driving such a difficult problem.

To be absurdly simplistic, this concept describes the difference between the way humans drive (top-down, meaning we begin with a holistic understanding of the overall context of the environment we're driving in) and the way machines drive (bottom-up, reacting to sensor inputs and rules without actually understanding the overall situation).

This is one of the most difficult obstacles for self-driving car technology to overcome. It’s not just about sensors and neural networks and powerful computers – we have to figure out ways to synthesize our cultural knowledge around driving, which is a big deal.

The good news is that I think companies are actively thinking about this. I recently met some people from Argo AI, who are using a disguised pre-production Volkswagen ID Buzz for their testing. I'll have a big article on them soon, but for the time being, here's a teaser pic:


Photo: Jason Torchinsky

Their approach to automated driving is quite different from Tesla's, which I'll get into again soon, but the important thing that came up in our conversation, and that encouraged me that they're at least considering these difficult-to-define issues, was one word: Halloween.

Argo's engineers understood that there are times when, for reasons that have nothing to do with driving, all the rules change. During Halloween, kids don't necessarily look like kids, and they'll be roaming the streets at night, in patterns and paths seen at no other time of year.

Everything the AI thinks it understands about pedestrian behavior stops applying during the candy-fueled madness of Halloween. And the engineers I spoke to understood this, and considered it a legitimate problem that needed some sort of solution.

Will they make a special case for October 31, where the car operates under completely different rules? Will the speed limits be more conservative that night? Will the lidar or camera setups work differently?
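For what it's worth, the simplest version of such a special case would be trivial to express. This is pure speculation on my part; Argo's actual system isn't public, and every name and number below is invented for illustration.

```python
# Speculative sketch of a date-based "Halloween mode."
# All names and speed values here are made up; nothing is from Argo AI.
from datetime import date

def residential_speed_cap_mph(today: date) -> float:
    """Cap residential speeds more conservatively on Halloween night."""
    if today.month == 10 and today.day == 31:
        return 15.0  # assume unpredictable pedestrians could be anywhere
    return 25.0      # normal residential default
```

Of course, the hard part isn't the calendar check; it's everything downstream of it, like deciding that a four-foot dinosaur on the sidewalk should be treated as a child.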

I don't know, but I do know that Halloween is one of many, many pieces of spectacular chaos that make human life so wonderful, and so hellish for machines to understand.

But it's our job to explain all of this to these machines, even if it means, at times, teaching them to break some general rules of the road.

None of this is easy, and that's good to remember.
