Lawyers for a man charged with murder in a triple homicide had sought to introduce cellphone video enhanced by machine-learning software.

A Washington state judge overseeing a triple murder case barred the use of video enhanced by artificial intelligence as evidence, in a ruling that experts said may be the first of its kind in a United States criminal court.

The ruling, signed Friday by King County Superior Court Judge Leroy McCullogh and first reported by NBC News, described the technology as novel and said it relies on “opaque methods to represent what the AI model ‘thinks’ should be shown.”

“This Court finds that admission of this AI-enhanced evidence would lead to a confusion of the issues and a muddling of eyewitness testimony, and could lead to a time-consuming trial within a trial about the non-peer-reviewable-process used by the AI model,” the judge wrote in the ruling, which was posted to the docket Monday.

The ruling comes as artificial intelligence and its uses — including the proliferation of deepfakes on social media and in political campaigns — quickly evolve, and as state and federal lawmakers grapple with the potential dangers posed by the technology.

  • sylver_dragon@lemmy.world · 7 months ago

    This seems like one of those technologies which may be useful as an investigatory tool, but should ultimately not be admissible in court. For example, if law enforcement has a grainy video of a crime, and they use AI enhancement to generate leads, that could be ok. Though, it will still have issues with bias and false leads; so, such usage should be tracked and data kept on it to show its usefulness and bias. But, anything done to a video by AI should almost universally be considered suspect. AI is really good at making up plausible results which are complete bullshit.