WhatsApp’s AI shows gun-wielding children when prompted with ‘Palestine’
By contrast, prompts for ‘Israeli’ do not generate images of people wielding guns, even in response to a prompt for ‘Israel army’

  • gayhitler420
    11 months ago

    i’m not sure of the point you’re trying to make in this post.

    i’m well aware of how loosely language is used in the field of machine learning.

    in an attempt to stay on topic: training a model on a dataset cannot and should not be described as “learning on its own”. no computer “decided” to do the “learning”. what actually happened is that a person invoked commands to make the computer do a bunch of matrix arithmetic on a big dataset, with a model as the output. later on, some other people queried the model and observed the output. that’s not learning on its own.
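    to make that concrete, here’s a minimal sketch of what “training” reduces to. this is a toy example with made-up data (a line y = 2x + 1) and a made-up learning rate, not any particular framework’s code; the point is that the whole process is a person-initiated loop of multiplication and addition whose output is a handful of numbers we call a “model”:

    ```python
    # Toy dataset: inputs x, targets y following y = 2x + 1.
    data = [(x, 2 * x + 1) for x in range(10)]

    # The "model" is just these two parameters.
    w, b = 0.0, 0.0
    lr = 0.01  # learning rate: chosen by the person, not the machine

    # "Training": the person decides how long the arithmetic runs.
    for _ in range(1000):
        for x, y in data:
            pred = w * x + b   # forward pass: multiply and add
            grad = pred - y    # error term
            w -= lr * grad * x # parameter updates: more arithmetic
            b -= lr * grad

    # "Querying the model" later is just evaluating the same arithmetic
    # with the numbers the loop produced; w and b end up near 2 and 1.
    print(w, b)
    ```

    at no point does anything in that loop “decide” to learn; it’s deterministic arithmetic kicked off by a human, same as any other batch job.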

    let me give you an example: if i give ffmpeg a bunch of stuff to encode overnight, would you say my computer is making films on its own? of course you wouldn’t; that’s absurd.