• Nevoic · English · 7 months ago (edited)

    Like sure, fuck Elon, but why do you think FSD is unsafe? Tesla publishes the accident rate, and it’s lower than the national average.

    There are times when it will fuck up; I’ve experienced this. But there are also times when it sees something I physically can’t because of blind spots or the car’s pillars.

    Having the car drive while you intervene is statistically safer than the national average. You could argue the inverse is better (you drive and the car intervenes), but I’d argue that system would be far worse: you’d be relinquishing final say to the computer, and we don’t have a legal system set up for that, regardless of how good the software is (e.g. you’re still responsible as the driver).

    You can call it a marketing term, but in reality it can and does successfully drive point to point with no interventions under normal conditions. The places it does fuck up are consistent fuck-ups (e.g. bad road markings that convey the wrong thing, which you only know about because you’ve been on that road thousands of times). It’s not human, but it’s far more consistent than a human, in both the ways it succeeds and the ways it fails. If you learn these patterns, you can spend more time paying attention to what other drivers are doing and to novel hazards (people, animals, etc.), and less time on trivial things like mechanically staying between two lines or adjusting your speed. Checking your blind spot or glancing to the side isn’t nearly as dangerous, for example, so you can take in more information.