TL;DR: Self-Driving Teslas Rear-End Motorcyclists, Killing at Least 5

Brevity is the soul of wit, and I am just not that witty. This is a long article; here is the gist of it:

  • The NHTSA’s self-driving crash data reveals that Tesla’s self-driving technology is, by far, the most dangerous for motorcyclists, with five fatal crashes that we know of.
  • This issue is unique to Tesla. Other self-driving manufacturers have logged zero motorcycle fatalities with the NHTSA in the same time frame.
  • The crashes are overwhelmingly Teslas rear-ending motorcyclists.

Read our full analysis as we go case-by-case and connect the heavily redacted government data to news reports and police documents.

Oh, and read our thoughts about what this means for the robotaxi launch that is slated for Austin in less than 60 days.

  • captainastronaut@seattlelunarsociety.org · 20 days ago

    Tesla self-driving is never going to work well enough without more sensors - cameras alone are not enough. It’s fundamentally dangerous and should not be driving unsupervised (or maybe at all).

    • KayLeadfoot@fedia.ioOP · 20 days ago

      Accurate.

      Each fatality I found where a Tesla kills a motorcyclist is a cascade of 3 failures.

      1. The car’s cameras don’t detect the biker, or it just doesn’t stop for some reason.
      2. The driver isn’t paying attention to detect the system failure.
      3. The Tesla’s driver alertness tech fails to detect that the driver isn’t paying attention.

      Taking out the driver will make this already-unacceptably-lethal system even more lethal.

      • jonne@infosec.pub · 20 days ago

        4. Self-driving turns itself off seconds before a crash, giving the driver an impossibly short window to rectify the situation.
        • KayLeadfoot@fedia.ioOP · 20 days ago

          … Also accurate.

          God, it really is a nut punch. The system detects the crash is imminent.

          Rather than automatically try to evade… the self-driving tech turns off. I assume it is to reduce liability or make the stats look better. God.

          • jonne@infosec.pub · 20 days ago

            Yep, that one was purely about hitting a certain KPI of ‘miles driven on autopilot without incident’. If it turns off before the accident, technically the driver was in control and to blame, so it won’t show up in the stats and probably also won’t be investigated by the NTSB.

              • KayLeadfoot@fedia.ioOP · 20 days ago

                NHTSA collects data if self-driving tech was active within 30 seconds of the impact.

                The companies themselves do all sorts of wildcat shit with their numbers. Tesla’s claimed safety factor right now is 8x human: driving with FSD is supposedly 8x safer than your average human driver. That’s what they say on their stock earnings calls. Of course, it isn’t true based on any data I’ve seen, and they haven’t published data that would make the claim externally verifiable (unlike Waymo, which has excellent academic articles and insurance papers written about its roughly 12x-safer-than-human system).
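
                To make that concrete, here’s a toy calculation (all numbers invented for illustration) showing how the attribution rule alone can swing a claimed safety factor:

                ```python
                # Toy example: how crash attribution changes a claimed safety factor.
                # All numbers are invented for illustration.

                miles = 1_000_000
                crashes_system_on = 2    # system still active at impact
                crashes_disengaged = 6   # system turned off < 30 s before impact
                human_rate = 1 / 50_000  # assumed human crashes per mile

                # Vendor-friendly counting: only crashes with the system on at impact.
                vendor_rate = crashes_system_on / miles
                # NHTSA-style counting: include crashes within 30 s of disengagement.
                nhtsa_rate = (crashes_system_on + crashes_disengaged) / miles

                print(human_rate / vendor_rate)  # 10.0x "safer than a human"
                print(human_rate / nhtsa_rate)   # 2.5x, same fleet, same miles
                ```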

              • jonne@infosec.pub · 20 days ago

                If they ever fixed that, I’m sure Musk has fired whoever was keeping score by now. He’s going to launch the robotaxi stuff soon, and it’s going to kill a bunch of people.

        • NeoNachtwaechter@lemmy.world · 20 days ago

          Even when it is just milliseconds before the crash, the computer turns itself off.

          Later, Tesla brags that the autopilot was not in use during this (terribly, overwhelmingly) unfortunate accident.

      • br3d@lemmy.world · 20 days ago

        There are at least two steps before those three:

        -1. Society has been built around the needs of the auto industry, locking people into car dependency.

        0. A legal system exists in which the people who build, sell, and drive cars are not meaningfully liable when the car hurts somebody.
    • ascense@lemm.ee · 20 days ago

      The most frustrating thing is that, as far as I can tell, Tesla doesn’t even have binocular vision, which makes all the claims about humans being able to drive with vision alone even more blatantly stupid. At least humans have depth perception. And supposedly their goal is to outperform humans?

      • TheGrandNagus@lemmy.world · 20 days ago

        Tesla’s argument of “well human eyes are like cameras therefore we shouldn’t use LiDAR” is so fucking dumb.

        Human eyes have good depth perception and absolutely exceptional dynamic range and focusing ability. They also happen to be linked up to a rapid and highly efficient supercomputer far outclassing anything that humanity has ever devised, certainly more so than any computer added to a car.

        And even with all those advantages humans have, we still crash from time to time and make smaller mistakes regularly.

        • NABDad@lemmy.world · 20 days ago

          > They also happen to be linked up to a rapid and highly efficient supercomputer far outclassing anything that humanity has ever devised

          A neural network that has been in development for 650 million years.

  • Gork@lemm.ee · 20 days ago

    Lidar needs to be mandated for these systems.

    • Echo Dot@feddit.uk · 20 days ago

      Or at least something other than just cameras. Even just adding ultrasonic sensors to the front would be an improvement.

    • ℍ𝕂-𝟞𝟝@sopuli.xyz · 20 days ago

      Honestly, emergency braking with LIDAR is mature and cheap enough at this point that it should be mandated for all new cars.
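
      The core trigger logic is simple enough to sketch. A minimal, illustrative version (the threshold and numbers are invented, not from any real system), braking on time-to-collision from a range sensor:

      ```python
      # Minimal sketch of range-sensor-based automatic emergency braking (AEB).
      # Assumes the sensor reports distance to the lead object and that closing
      # speed is estimated from successive range readings.

      def should_brake(range_m: float, closing_speed_mps: float,
                       ttc_threshold_s: float = 1.5) -> bool:
          """Brake when time-to-collision drops below the threshold."""
          if closing_speed_mps <= 0:  # not closing on the object
              return False
          time_to_collision_s = range_m / closing_speed_mps
          return time_to_collision_s < ttc_threshold_s

      # Example: 30 m behind a motorcycle, closing at 25 m/s (~90 km/h delta):
      print(should_brake(30.0, 25.0))  # True: TTC is 1.2 s, brake now
      ```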

    • TrackinDaKraken@lemmy.world · 20 days ago

      How about we disallow it completely until it’s proven to be SAFER than a human driver? Because why even allow it if it’s only as safe?

  • Buffalox@lemmy.world · 20 days ago

    Hey guys relax! It’s all part of the learning experience of Tesla FSD.
    Some of you may die, but that’s a sacrifice I’m willing to make.

    Regards
    Elon Musk
    CEO of Tesla

  • keesrif@lemmy.world · 20 days ago

    On a quick read, I didn’t see the struck motorcycles listed. The last I heard, a few years ago, was that this mainly affected motorcycles with two rear lights that are spaced apart and fairly low to the ground. I believe this is mostly true for Harleys.

    The theory I recall was that this rear-light configuration made the Tesla assume it was looking (remember: only cameras, no depth data) at a car that was further down the road, so accelerating was judged safe. It miscategorised the motorcycle so badly that it misjudged its position entirely.
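
    A quick back-of-the-envelope sketch of that ambiguity (the light spacings are invented for illustration): a single camera only measures the angle between the two lights, and a narrow pair of lights up close subtends exactly the same angle as a wide pair far away.

    ```python
    # Angular separation of two tail lights as seen by a single camera.
    # Without depth data, the camera gets this angle but not the distance.
    import math

    def angular_separation_rad(light_spacing_m: float, distance_m: float) -> float:
        """Angle subtended by two lights at a given distance."""
        return 2 * math.atan((light_spacing_m / 2) / distance_m)

    # Hypothetical numbers: a motorcycle with lights ~0.3 m apart at 10 m
    # subtends the same angle as a car with lights ~1.5 m apart at 50 m.
    print(angular_separation_rad(0.3, 10.0))  # ~0.030 rad
    print(angular_separation_rad(1.5, 50.0))  # ~0.030 rad
    ```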

    • jonne@infosec.pub · 20 days ago

      Whatever it is, it’s unacceptable and they should really ban Tesla’s implementation until they fix some fundamental issues.

    • KayLeadfoot@fedia.ioOP · 20 days ago

      I also saw that theory! That’s in the first link in the article.

      The only problem with the theory: Many of the crashes are in broad daylight. No lights on at all.

      I didn’t include the motorcycle make and model, but I did find it. Because I do journalism, and sometimes I even do good journalism!

      The models I found are: Kawasaki Vulcan (a cruiser bike, just like the Harleys you describe), Yamaha YZF-R6 (a racing-style sport bike with high-mount lights), and a Yamaha V-Star (a “standard” bike, fairly low lights, and generally a low-slung bike). Weirdly, the bike models run the full gamut of the different motorcycles people ride on highways; every type is represented (sadly) in the fatalities.

      I think you’re onto something with the depth perception failure. Judging distance is genuinely hard with optical sensors alone. That would explain why Tesla is alone in the motorcycle fatality bracket, and why it’s always the Tesla rear-ending the motorcycle.

      • littleomid@feddit.org · 20 days ago

        At least in the EU, you can’t turn off motorcycle lights; they’re always on. Mandatory in the EU since 2003, and in the US, according to the internet, since the ’70s.

        • KayLeadfoot@fedia.ioOP · 20 days ago

          Point taken: Feel free to amend my comment from “No lights on at all” to “No lights visible at all.”

    • ExcessShiv@lemmy.dbzer0.com · 20 days ago

      The ridiculous thing is, it has 3 cameras pointing forward, and you only need 2 to get stereoscopic depth perception with cameras… why the fuck are they not using that!?

      Edit: I mean, I know why: the three cameras have different lenses for different jobs (normal, wide angle, and telephoto), so they’re not well suited to stereo matching, but it just seems stupid not to utilise that concept when you insist on a camera-only solution.
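
      For reference, the stereo geometry itself is one line of maths: with two matched, rectified cameras, depth = focal length × baseline / disparity. A minimal sketch with invented camera numbers:

      ```python
      # Minimal sketch of stereo depth from two identical, rectified cameras
      # (standard pinhole model: depth = focal_px * baseline_m / disparity_px).

      def stereo_depth_m(focal_px: float, baseline_m: float,
                         disparity_px: float) -> float:
          """Distance to a point whose image shifts disparity_px between cameras."""
          if disparity_px <= 0:
              raise ValueError("zero disparity means the point is at infinity")
          return focal_px * baseline_m / disparity_px

      # Hypothetical rig: 1000 px focal length, cameras 30 cm apart.
      # A tail light that shifts 12 px between the two images is ~25 m away.
      print(stereo_depth_m(1000.0, 0.3, 12.0))  # 25.0
      ```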

    • treadful@lemmy.zip · 20 days ago

      Still probably a good idea to keep an eye on that Tesla behind you. Or just let them past.

  • lnxtx (xe/xem/xyr)@feddit.nl · 20 days ago

    Stop dehumanizing the drivers who killed people.
    The feature, wrongly called Full Self-Driving, must be supervised at all times.

    • SouthEndSunset@lemm.ee · 20 days ago

      If you’re going to say your car has “full self driving”, it should have that, not “full self driving (but needs monitoring)” or “full self driving (but it disconnects 2 seconds before impact)”.

  • 0x0@programming.dev · 20 days ago

    This is news? Fortnine talked about it two years ago.
    TL;DR: Tesla skipped LIDAR and removed radar to save a buck, and at night the cameras see two red dots that the ’puter thinks are a far-away car, when it’s actually a close motorcycle.

    • TexasDrunk@lemmy.world · 20 days ago

      I’m on mine far more often than I’m in a car. I think Tesla found out that I point and laugh at any Cybertrucks I see at red lights while I’m out and is trying to kill me.

  • misteloct@lemmy.world · 20 days ago

    I’m wondering how that stacks up against human drivers. Since the data is redacted, I’m guessing: not well at all.

    • kameecoding@lemmy.world · 20 days ago

      Because muh freedum, EU are a bunch of commies for not allowing this awesome innovation on their roads

      (I fucking love living in the EU)

    • Not_mikey@lemmy.dbzer0.com · 20 days ago

      Robots don’t get drunk, or distracted, or text, or speed…

      Anecdotally, I think the Waymos are more courteous than human drivers, though Waymo seems to be the best one out so far; idk about the other services.

        • dogslayeggs@lemmy.world · 20 days ago

          They have remote drivers who CAN take control in rare corner-case situations that the software can’t handle. The vast majority of driving is done without humans in the loop.

          • NotMyOldRedditName@lemmy.world · 20 days ago

            They don’t even do that, according to Waymo’s claims.

            They can suggest what the car should do, but they aren’t actually doing it. The car is in complete control.

            It’s a nuanced difference, but it is a difference. A Waymo employee never takes control of or operates the vehicle.

  • Ulrich@feddit.org · 20 days ago

    I’m not sure how that’s possible considering no one manufactures self-driving cars that I know of. Certainly not Tesla.

  • sfu@lemm.ee · 20 days ago

    Self-driving vehicles should be against the law.