• lonewalk · 10 months ago

    I read the article, and stand by my statement: “AI” does not apply to self-driving cars in the same way it applies to robotics used by law enforcement. These are two separate categories of problems, and I don’t see how some unified frustration at AI or robotics covers both.

    Self-driving cars have issues because the machine learning systems trained to control them are not sufficient to navigate the complexities of real roads, and there is no human fallback. (See: autopilot.)

    Robotics use by law enforcement has issues because it removes a human factor from enforcement, which raises the question of whether deadly force is ever justified in its use (does a suspect pose a danger to any officer if there is no human contact?). Worries about dehumanization exist here too, along with other factors like data collection. These mostly aren’t even self-driving; from what I understand, law enforcement remote-pilots them.

    These are separate problem spaces: they aren’t deadly in the same ways, aren’t unattractive in the same ways, and should be treated and analyzed as distinct problems. By reducing everything to “AI” and “robots,” you create a problem that makes sense only to the technically uninclined, and you blur any meaningful discussion of the specifics of each issue.