TL;DR: Self-Driving Teslas Rear-End Motorcyclists, Killing at Least 5
Brevity is the soul of wit, and I am just not that witty. This is a long article, so here is the gist of it:
- The NHTSA’s self-driving crash data reveals that Tesla’s self-driving technology is, by far, the most dangerous for motorcyclists, with five fatal crashes that we know of.
- This issue is unique to Tesla. Other self-driving manufacturers have logged zero motorcycle fatalities with the NHTSA in the same time frame.
- The crashes are overwhelmingly Teslas rear-ending motorcyclists.
Read our full analysis as we go case-by-case and connect the heavily redacted government data to news reports and police documents.
Oh, and read our thoughts about what this means for the robotaxi launch that is slated for Austin in less than 60 days.
Tesla self-driving is never going to work well enough without more sensors; cameras alone are not enough. It's fundamentally dangerous and should not be driving unsupervised (or maybe at all).
Accurate.
Each fatality I found where a Tesla killed a motorcyclist is a cascade of three failures.
Taking out the driver will make this already-unacceptably-lethal system even more lethal.
… Also accurate.
God, it really is a nut punch. The system detects that the crash is imminent.
Rather than automatically trying to evade… the self-driving tech turns off. I assume it's to reduce liability or make the stats look better. God.
Yep, that one was purely about hitting a certain KPI of ‘miles driven on autopilot without incident’. If it turns off before the accident, technically the driver was in control and to blame, so it won’t show up in the stats and probably also won’t be investigated by the NTSB.
Hopefully they've wised up by now and record these stats properly…?
NHTSA collects data if the self-driving tech was active within 30 seconds of the impact, so a system that disengages moments before the crash still shows up in the federal data.
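For anyone curious how that window closes the turn-off loophole, here's a minimal sketch, assuming the 30-second rule works as described above; the field names and timestamps are made up for illustration:

```python
from dataclasses import dataclass

# Per the reporting rule described above, a crash counts if the
# driver-assist system was engaged at any point in the 30 seconds
# before impact -- disengaging milliseconds early doesn't exempt it.
ATTRIBUTION_WINDOW_S = 30.0

@dataclass
class Crash:
    impact_time_s: float    # hypothetical timestamp of the impact
    last_engaged_s: float   # hypothetical last moment the system was active

def is_reportable(crash: Crash) -> bool:
    """True if the system was engaged within the 30 s window before impact."""
    return crash.impact_time_s - crash.last_engaged_s <= ATTRIBUTION_WINDOW_S

# System shuts itself off 0.1 s before the crash: still reportable.
print(is_reportable(Crash(impact_time_s=100.0, last_engaged_s=99.9)))  # True
# System was last engaged a full minute earlier: outside the window.
print(is_reportable(Crash(impact_time_s=100.0, last_engaged_s=40.0)))  # False
```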
The companies themselves do all sorts of wildcat shit with their numbers. Tesla's claimed safety factor right now is 8x human: driving with FSD is supposedly eight times safer than the average human driver, which is what they say on their stock earnings calls. Of course, that's not true based on any data I've seen, and they haven't published data that makes the claim externally verifiable (unlike Waymo, which has excellent academic articles and insurance papers written about its 12x-safer-than-human system).
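For anyone who hasn't seen how these headline numbers get made: a "safety factor" is just a ratio of crash rates, and it's only as honest as the bookkeeping behind each rate. A back-of-envelope sketch, with every number invented purely for illustration:

```python
# Invented rates -- the point is the arithmetic, not the values.
human_crashes_per_mm = 1.6   # hypothetical human crashes per million miles
fsd_crashes_per_mm = 0.2     # hypothetical FSD crashes per million miles

safety_factor = human_crashes_per_mm / fsd_crashes_per_mm
print(f"claimed safety factor: {safety_factor:.0f}x")  # 8x

# The same arithmetic shows how exclusions inflate the claim: if crashes
# where the system disengaged just before impact were dropped from the
# FSD count, the denominator shrinks and the "factor" grows.
fsd_counted = 0.1  # hypothetical: half the crashes excluded via disengagement
print(f"inflated factor: {human_crashes_per_mm / fsd_counted:.0f}x")  # 16x
```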
If they ever fixed it, I'm sure Musk has since fired whoever was keeping score. He's going to launch the robotaxi stuff soon, and it's going to kill a bunch of people.
Even when it is just milliseconds before the crash, the computer turns itself off.
Later, Tesla brags that the autopilot was not in use during this (terribly, overwhelmingly) unfortunate accident.
There are at least two steps before those three:
-1. Society has been built around the needs of the auto industry, locking people into car dependency
The most frustrating thing is that, as far as I can tell, Tesla doesn't even have binocular vision, which makes all the claims about humans being able to drive with vision alone even more blatantly stupid. At least humans have depth perception. And supposedly their goal is to outperform humans?
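For context on why binocular vision matters: with two cameras, depth falls straight out of triangulation; with one, it has to be inferred. A minimal sketch of the stereo geometry, with illustrative numbers that are not anyone's actual hardware specs:

```python
def stereo_depth_m(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth of a point from a rectified stereo pair: Z = f * B / d.

    focal_px: focal length in pixels; baseline_m: distance between the
    two cameras; disparity_px: horizontal shift of the same point
    between the left and right images.
    """
    return focal_px * baseline_m / disparity_px

# Two cameras 12 cm apart see a point shifted 8 px: it's ~15 m away.
print(stereo_depth_m(focal_px=1000.0, baseline_m=0.12, disparity_px=8.0))
# A single camera has no disparity to triangulate from, so it must infer
# depth from learned cues like apparent size -- the kind of guess that
# can fail on a small, unfamiliar silhouette like a motorcycle at night.
```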
Tesla’s argument of “well human eyes are like cameras therefore we shouldn’t use LiDAR” is so fucking dumb.
Human eyes have good depth perception and absolutely exceptional dynamic range and focusing ability. They also happen to be linked up to a rapid and highly efficient supercomputer that far outclasses anything humanity has ever devised, certainly any computer added to a car.
And even with all those advantages humans have, we still crash from time to time and make smaller mistakes regularly.
A neural network that has been in development for 650 million years.