Recently, we reported on NHTSA opening an investigation into Teslas that crashed into emergency vehicles, such as police cars or fire trucks, while their Level 2 semi-automated Autopilot driver-assist system was active. It appears that another such crash may have occurred Saturday morning, when a Tesla Model 3 hit a police car that was stopped on the side of the road to help a motorist in Orlando, Florida. Fortunately, no one was injured, but the incident seems to reinforce the idea that Autopilot has a problem with authority.
A Florida Highway Patrol (FHP) statement reported that an FHP officer had stopped to help the driver of a disabled 2012 Mercedes-Benz GLK 350 on the shoulder of I-4 near Orlando.
A 2019 Tesla Model 3, which the driver claims was in Autopilot mode, hit and sideswiped the police car, then struck the disabled Mercedes as well, perhaps just to be really, really sure everyone's day would be ruined.
An investigation is currently underway to confirm the reports about the status of Autopilot and what role it played in the crash. It's certainly not impossible that this information could turn out to be wrong, but, again, this type of incident is exactly what has been seen before with Autopilot, hence the NHTSA investigation.
I've made my stance on Autopilot, and really all Level 2 semi-automated driving systems, fairly clear: they are problematic not for technical reasons, but for conceptual ones that have to do with the way humans — Tesla's main target market, and that of many other new vehicles — interact with them. And I'm not alone in thinking that.
Humans are quite good at avoiding emergency vehicles parked on the side of the road. Autopilot seems to be very bad at it. If Autopilot had been used correctly in this case, the human driver would have seen that the car was about to drive straight into the police car and would have taken over.
But the problem is that when a system does virtually all of the actual driving, as Autopilot often can in highway situations, and the system appears to work well in general, humans are terrible at keeping their attention focused on monitoring it.
That is the human's fault, for sure, but it is also the result of a bad system that does not take into account how human beings actually work. It's like that terrible hockey-puck mouse that Apple made about 20 years ago: technically, it was fine, but the design didn't take into account the way human hands are shaped, and as a result, it was a disaster.
It's also worth mentioning that those mice didn't crash into police cars on the side of the road. Autopilot and other L2 systems make the same basic mistake of ignoring how humans actually function.
I'm curious to see the results of this investigation and whether video can be pulled from the car. If it proves Autopilot was in control, I hope this will spur Tesla to really focus on improving the system's ability to avoid cars parked on the sides of roads in an update as soon as possible.