Safe Streets Rebel’s protest comes after autonomous vehicles were blamed for incidents including crashing into a bus and running over a dog. City officials in June said…

  • over_clox@lemmy.world
    +102 / −13 · 1 year ago

    You make it sound like it’s a 50/50 split between human drivers and autonomous vehicles, which is definitely not the case.

    There are way more human drivers than autonomous vehicles. So, when an autonomous vehicle runs your child or pet over or whatever, who do you blame? The company? The programmers? The DMV for even allowing them on the road in the first place?

    What’s an autonomous vehicle supposed to do if it gets a flat? Park in the middle of the interstate like an idiot, or pull over and phone home for a mechanic?

    • donalonzo@lemmy.world
      +47 / −31 · edited · 1 year ago

      You first need to ask yourself whether it is more important to assign blame than to minimize risk.

      “Autonomous vehicles could potentially reduce traffic fatalities by up to 90%.”

      “Autonomous vehicle accidents have been recorded at a slightly lower rate compared with conventional cars, at 4.7 accidents per million miles driven.”

      https://blog.gitnux.com/driverless-car-accident-statistics/
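
For context, the per-million-mile number is just a normalization of raw counts. A minimal sketch in Python (the fleet figures below are hypothetical, not the article’s underlying data):

```python
def accidents_per_million_miles(accidents: int, miles: float) -> float:
    """Normalize a raw accident count to a rate per million miles driven."""
    return accidents / miles * 1_000_000

# Hypothetical fleet: 470 accidents over 100 million miles
rate = accidents_per_million_miles(470, 100_000_000)
print(rate)  # 4.7
```

The rate only means something if the accident counts and mileage are measured the same way for both fleets being compared.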

      • HedonismB0t@lemmy.ml
        +56 / −18 · 1 year ago

        That opinion puts a lot of blind faith in the companies developing self driving and their infinitely altruistic motives.

          • rtxn@lemmy.world
            +48 / −14 · edited · 1 year ago

            It’s not a strawman argument, it is a fact. Without the ability to audit the entire codebase of self-driving cars, there’s no way to know whether a manufacturer has knowingly hidden something in the code, a fault in the self-driving technology linked to accidents and fatalities too numerous to recount but too important to ignore.

            I was actually trying to find an article I’d read about Tesla’s self-driving software reverting to manual control moments before impact, but I was literally flooded by fatality reports.

            • kep@lemmy.world
              +16 / −5 · 1 year ago

              Strawman arguments can be factual. The entire point is that you’re responding to something that wasn’t the argument. You’re putting words in their mouth to defeat them instead of addressing their words at face value. It is the definition of a strawman argument.

            • HobbitFoot @thelemmy.club
              +21 / −10 · 1 year ago

              We can’t audit the code for humans, but we still let them drive.

              If the accident rate for computers driving is lower than for humans, and the computers’ designers are held just as financially liable for car crashes as human drivers are, why shouldn’t we let computers drive?

              • Shayreelz@sh.itjust.works
                +18 · 1 year ago

                I’m not fully in either camp in this debate, but FWIW, the humans we let drive generally suffer consequences if there’s an accident due to their own negligence.

                • Obi@sopuli.xyz
                  +12 · 1 year ago

                  Also, we do audit them; it’s called a license. I know it’s super easy to get one in the US, but in other countries they can be quite stringent.

                • HobbitFoot @thelemmy.club
                  +2 / −8 · 1 year ago

                  And I’m not denying it. However, it takes a very high bar to get someone convicted of vehicular manslaughter and that usually requires evidence that the driver was grossly negligent.

                  If you can show that a computer can drive as well as a sober human, where is the gross negligence?

              • rambaroo@lemmy.world
                +12 · edited · 1 year ago

                Because there’s no valid excuse to prevent us from auditing their software, and it could save lives. Why the hell should we allow them to use the road if they won’t even let us inspect the engine?

                A car isn’t a human. It’s a machine, and it can and should be inspected. Anything less than that is pure recklessness.

                • HobbitFoot @thelemmy.club
                  +1 / −1 · 1 year ago

                  Why the hell should we allow then to use the road if they won’t even let us inspect the engine?

                  How do you think a car gets approved right now? Do we take it apart? Do we ask for the design calculations of how they designed each piece?

                  That isn’t what happens. There is no “audit” of the parts or the whole. Instead, there is a series of tests for roadworthiness that everything in a car has to pass. We’ve already accepted a black box for the electronics of a car: you don’t need approval of your code to show that pressing the brake pedal causes the brake lights to turn on; they just test it to make sure it works.

                  We already don’t audit the code for life-critical software. It is all liability taken on by the manufacturers and verified via government testing of the finished product. What would an audit accomplish that we don’t do already?

            • donalonzo@lemmy.world
              +11 / −7 · edited · 1 year ago

              It is most definitely a strawman to frame my comment as considering the companies “infinitely altruistic”, no matter what lies behind the strawman. It doesn’t refute my statistics; it just tries to make it look like I’m making an extremely silly argument that I’m not making, which is the definition of a strawman argument.

              • rambaroo@lemmy.world
                +7 · edited · 1 year ago

                The data you cited comes straight from manufacturers, who’ve repeatedly been shown to lie and cherry-pick their data to intentionally mislead people about driverless car safety.

                So no it’s not a straw man argument at all to claim that you’re putting inordinate faith in manufacturers, because that’s exactly what you did. It’s actually incredible to me how many of you are so irresponsible that you’re not even willing to do basic cross-checking against an industry that is known for blatantly lying about safety issues.

            • vinnymac@lemmy.world
              +3 / −6 · 1 year ago

              It may be the case that every line of code of all self driving vehicles is not available for a public audit. But neither is the instruction set of every human who was taught to drive properly on the road today.

              I would hope that through protesting and new legislation, that we will see the industry become more safe over time. Which we simply will never be able to achieve with human drivers.

        • IntoDaLagoon@lemmygrad.ml
          +13 / −4 · 1 year ago

          What do you mean, I’m sure the industry whose standard practices include having the self-driving function turn itself off nanoseconds before a crash to avoid liability is totally motivated to spend the time and money it would take to fix the problem. After all, we live in a time of such advanced AI that all the news sites and magazines tell me we’re on the verge of the Singularity, and they’ve never misled me before.

          • Red Wizard 🪄@lemmygrad.ml
            +6 / −1 · 1 year ago

            I feel like I’m taking crazy pills, because no one seems to know or give a shit that Tesla was caught red-handed doing this. They effectively murdered those drivers.

        • biddy@feddit.nl
          +9 / −2 · 1 year ago

          That wasn’t an opinion; it was a statistic.

          No (large public) company ever has altruistic motives. They aren’t inherently good or bad, just machines driven by profit.

        • HobbitFoot @thelemmy.club
          +9 / −5 · 1 year ago

          You don’t need to put faith into companies beyond the faith that is put into humans. Make companies just as financially liable as humans are, and you’ll still see a decrease in accidents.

          • xavier666
            +7 · 1 year ago

            You mean those companies who will lobby and spend a fraction of their wealth to make those lawsuits disappear?

            • HobbitFoot @thelemmy.club
              +3 · 1 year ago

              How is that different from the current system of large vehicular insurance companies spending a fraction of their wealth to make their lawsuits disappear?

              • xavier666
                +6 · 1 year ago

                It’s no different at all. We should have stronger laws for such scenarios.

                • HobbitFoot @thelemmy.club
                  +2 / −1 · 1 year ago

                  Ok, but in the context of letting computers drive, I feel like people want to impose a perfect system of liability on automated systems, when the existing criminal and civil legal system holds human drivers to nowhere near that standard.

                  Why are we willing to say it is unacceptable for a computer to ever kill someone on the road, when almost 43,000 people die in the USA due to human driving?

                  • rambaroo@lemmy.world
                    +1 · edited · 1 year ago

                    Uh, because software can be fixed and those deaths can be prevented? How the hell can you ask this question seriously? I can’t believe how many people are willing to blatantly shill for these companies, even if it gets people fucking killed.

                    And no you can’t claim to be saving lives because these driverless cars very often kill people in situations that a human driver would easily navigate.

                  • xavier666
                    +2 / −1 · 1 year ago

                    Why are we willing to say it is unacceptable for a computer to ever kill someone on the road, when almost 43,000 people die in the USA due to human driving?

                    This part is bogus to me as well. My friend who used to work in self-driving said that once self-driving can be “just” better than human driving, the technology has won. In statistical terms, that means slightly fewer fatalities than humans (fewer than 43k fatalities for the same number of drivers).

                    What’s up for debate is how much fewer: just a 5% reduction, or 50%? If we insist on a 99% reduction first, we may as well stop building self-driving tech altogether.
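
To make those percentages concrete against the 43,000 figure cited upthread, a quick Python sketch (the reduction levels are just the hypothetical ones being debated):

```python
US_ANNUAL_ROAD_DEATHS = 43_000  # figure cited upthread

def deaths_after_reduction(baseline: int, reduction_pct: float) -> int:
    """Annual road deaths remaining if fatalities drop by reduction_pct percent."""
    return round(baseline * (1 - reduction_pct / 100))

for pct in (5, 50, 90):
    print(f"{pct}% reduction -> {deaths_after_reduction(US_ANNUAL_ROAD_DEATHS, pct):,} deaths/year")
```

Even a “modest” 5% improvement would mean about 2,150 fewer deaths per year.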

      • kewjo@lemmy.world
        +3 / −1 · 1 year ago

        Are there actual datasets to look at, and info on how the data was collected? All the sources on that page are just domain links and don’t appear to point to the data behind the claims.

        4.7 accidents per million miles doesn’t mean much if the cars are limited to specific roads, or if the figure includes test tracks that give them an advantage. The variance across different environments would also need to be measured: weather effects, road conditions, and traffic patterns.

        I’m all for autonomous driving, but it’s not like companies don’t fudge numbers all the time for their benefit.
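
The mileage-mix concern can be illustrated with invented numbers: a fleet that logs most of its miles in easy conditions can post a better overall rate while being worse in the hard ones. A sketch (every count below is hypothetical):

```python
# (accidents, miles) per driving environment -- all numbers invented
av_fleet = {"clear_urban": (400, 90e6), "rain_highway": (70, 10e6)}
humans   = {"clear_urban": (500, 50e6), "rain_highway": (250, 50e6)}

def rate_per_million(accidents: float, miles: float) -> float:
    return accidents / miles * 1e6

def overall(data: dict) -> float:
    return rate_per_million(sum(a for a, _ in data.values()),
                            sum(m for _, m in data.values()))

print(overall(av_fleet), overall(humans))  # 4.7 vs 7.5: AVs look safer overall
print(rate_per_million(*av_fleet["rain_highway"]),
      rate_per_million(*humans["rain_highway"]))  # 7.0 vs 5.0: but worse in the rain
```

This is the kind of confounding a single headline rate can hide, which is why the stratified data matters.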

      • over_clox@lemmy.world
        +5 / −3 · 1 year ago

        So…

        Your car is at fault. Their kid is dead.

        Who pays for the funeral?

        Does your insurance cover programming glitches?

        • HumbertTetere@feddit.de
          +12 / −1 · 1 year ago

          If your insurer determines that an autonomous vehicle will cause less damage over time than a human driver, they will cover it, yes.

          • over_clox@lemmy.world
            +1 / −5 · 1 year ago

            Autonomous logic doesn’t pay insurance, does it?

            If so, who TF is paying the insurance behind the scenes, and who is responsible?

            • HumbertTetere@feddit.de
              +9 · 1 year ago

              If so, who TF is paying the insurance behind the scenes

              The owner of the vehicle is probably very openly paying.

              • Flying Squid@lemmy.world
                +1 · 1 year ago

                Here’s a question- if you have to agree to terms of service for the vehicle to function, and I’m guessing you would, is it really your vehicle?

              • over_clox@lemmy.world
                +1 / −5 · edited · 1 year ago

                We’re talking about autonomous vehicles here, no driver, company owned.

                So is Alphabet responsible?

                Do your homework, these vehicles are owned by the parent company of Google and Apple, Alphabet. These vehicles have no private owner. So again, who TF is responsible?

                • HumbertTetere@feddit.de
                  +7 / −1 · 1 year ago

                  So what? It’s not the gotcha you apparently believe to have found, companies can have insurance…

                • Whirlybird@aussie.zone
                  +4 · 1 year ago

                  these vehicles are owned by the parent company of Google and Apple, Alphabet.

                  Alphabet don’t own Apple.

                  • over_clox@lemmy.world
                    +1 / −5 · 1 year ago

                    I’ll take your word on that. I’ve edited my comment to reflect that, but last research I did a few years ago, both companies were under the umbrella of Alphabet.

        • CoderKat
          +6 · 1 year ago

          I mean, why shouldn’t it? Is a programming glitch in a self-driving car all that different from a mechanical issue in a manually driven car?

          • over_clox@lemmy.world
            +3 · 1 year ago

            AI-driven cars are just as prone to mechanical issues. Is the AI smart enough to deal with a flat tire? Will it pull over to the side of the road before phoning in for a mechanic, or will it just ignorantly hard-stop right in the middle of the interstate?

            What does the AI do when there’s a police officer directing traffic around an accident or through a faulty red-light intersection? I’ve literally seen videos of that; the AI couldn’t give two shits about a cop’s orders as to which way to drive the vehicle.

      • over_clox@lemmy.world
        +3 / −9 · 1 year ago

        Story time…

        I once had a crazy accident driving only like 15-20 MPH or so down a side road, then about 20 feet in front of me some idiot backed out of his parking spot right in front of me.

        Broad daylight, overcast skies, no other vehicles blocking his view even. Dude just backed up without looking like a freaking idiot.

        I responded in a split second. I did not hit the brakes, as I knew I didn’t have enough time or distance to stop. If I had hit the brakes, his car would have had more time to back out further and I would have smacked straight on into the passenger side of his car.

        Instead of hitting the brakes, I quickly jerked the steering wheel hard and fast to the left. I knew an impact was inevitable at that point, so I made that move to clip his bumper instead of smacking into the passenger side and ruining both vehicles.

        Would an AI do that? 🤔
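
As a back-of-the-envelope check of the braking-vs-steering call (Python; the 0.25 s actuation delay and 7 m/s² dry-pavement deceleration are assumptions, not measured values):

```python
MPH_TO_MPS = 0.44704
FT_TO_M = 0.3048

def stopping_distance_m(speed_mps: float, reaction_s: float = 0.25,
                        decel_mps2: float = 7.0) -> float:
    """Distance covered during the actuation delay plus braking distance v^2 / (2a)."""
    return speed_mps * reaction_s + speed_mps ** 2 / (2 * decel_mps2)

v = 20 * MPH_TO_MPS              # ~8.9 m/s
needed = stopping_distance_m(v)  # ~7.9 m, about 26 ft
available = 20 * FT_TO_M         # ~6.1 m
print(needed > available)        # True: braking alone can't prevent the impact
```

Under these assumptions the car cannot stop within 20 feet from 20 MPH, so steering to glance the bumper is a defensible choice.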

          • over_clox@lemmy.world
            +4 / −14 · 1 year ago

            They tend to work on basic sensors and simplified logic. They don’t tend to consider forward momentum and a vehicle pulling out perpendicular in front of you.

            I believe half the programmers of autonomous vehicles never even drove a vehicle in their life.

        • SCB@lemmy.world
          +6 · 1 year ago

          It’s weird that you think this isn’t the suggested driving practice in such an instance.