Recently, we reported that NHTSA has launched an investigation into Teslas crashing into emergency vehicles, such as police cars or fire trucks, while the Autopilot Level 2 semi-automated driving system was activated. It looks like there's been another incident like this: on Saturday morning, a Tesla Model 3 collided with a police car that was parked on the side of the road to help a motorist in Orlando, Florida. Thankfully, no one was injured, but it appears to reinforce the idea that Autopilot has a problem with authority.
A Florida Highway Patrol (FHP) statement said an FHP officer had stopped on the shoulder of I-4 in Orlando to assist the driver of a disabled 2012 Mercedes-Benz GLK 350.
The 2019 Tesla Model 3, which the driver claims was in Autopilot mode, hit and sideswiped the police car, then hit the stricken Mercedes too, presumably just to be thorough and make sure everyone's day was ruined.
An investigation is currently underway to confirm whether Autopilot was engaged and what role it played in the accident; it's certainly not impossible that this claim could prove to be false, but again, this is exactly the sort of behavior that has been observed with Autopilot before, hence the NHTSA investigation.
I’ve made my stance on Autopilot, and, in fact, all Level 2 semi-automated driving systems, pretty clear: they suck, not necessarily for technical reasons, but for conceptual ones involving how humans, the primary market for Teslas and many other new cars, actually interact with them. And I’m not the only one who thinks this.
Humans are very good at avoiding emergency vehicles parked on the side of the highway. Autopilot seems pretty bad at this. If Autopilot were being used properly in this instance, the human driver would have noticed that the car was about to drive smack into the police car and taken over.
But the problem is that when a system is doing almost all of the actual driving, as Autopilot can often do in highway conditions and as the system is typically marketed, humans are terrible at staying focused on the job of monitoring it.
It’s the human’s fault, of course, but it’s also the result of a bad system design that doesn’t take into account how humans actually work. It’s like that awful hockey-puck mouse Apple made about 20 years ago: technically, it worked fine, but the design didn’t take into account how human hands are actually shaped, and as a result, it was a disaster.
Those mice didn’t crash into police cars on the side of the road, it’s also worth noting. Autopilot and other L2 systems are making the same basic mistake by not paying attention to how humans actually work.
I look forward to seeing the results of this investigation, and whether video can be pulled from the car. If it shows Autopilot was in control, I would expect Tesla to focus on improving the system’s ability to avoid cars parked on the sides of the road as soon as possible, in an update to the system.