• assassin_aragorn@lemmy.world · 11 months ago

    I would actually say yes, the Tesla poses more risk. Driving safety is all about anticipating what the other drivers are going to do. After commuting in Houston for 2-3 years, I became quite good at identifying scenarios where something dangerous could happen. I wasn’t always right about whether they would actually happen, but I was always prepared to take action in case they did. For instance, if the positioning is right for someone to suddenly cut you off, you can hang back and see if they actually do it. If a larger vehicle is next to you and you’re both making a turn, you can be wary of it spilling into your lane. That’s actually how I avoided a collision today.

    We have a sense of what human drivers might do. We don’t have that sense for self-driving cars. I can’t adequately predict when I need to take defensive action, because their behavior is totally foreign to me. They may run a red light well after it has turned red, while I would expect a human to do that only if the light had just changed. It’s very rare for someone to run a red light that they’ve only ever seen as red.

    This same concept is why you can’t make a 100% safe self-driving car. Driving safety is a function of everyone on the road. You could drive as safely as possible, but you’re still at the mercy of everyone else’s decisions. Introducing a system that people aren’t familiar with will create a disruption, and disruptions cause accidents.

    Everyone has to adopt self-driving technology at about the same time. When it’s mostly self-driving cars on the road, it can be incredibly safe. But the in-between period, when the technology isn’t fully adopted, is when risk increases.

    • LittleLordLimerick · 11 months ago

      This same concept is why you can’t make a 100% safe self-driving car. Driving safety is a function of everyone on the road. You could drive as safely as possible, but you’re still at the mercy of everyone else’s decisions. Introducing a system that people aren’t familiar with will create a disruption, and disruptions cause accidents.

      Again, we don’t need a 100% safe self-driving car; we just need a self-driving car that’s at least as safe as a human driver.

      I disagree with the premise that human drivers are entirely predictable on the road, and I also disagree that self-driving cars are less predictable. Computers are pretty much the very definition of predictable: they follow the rules and don’t make last-minute decisions (unless their programming is faulty), and they can be trained to always err on the side of caution.