WhatsApp’s AI shows gun-wielding children when prompted with ‘Palestine’::By contrast, prompts for ‘Israeli’ do not generate images of people wielding guns, even in response to a prompt for ‘Israel army’
Doesn’t what give me pause?
“Learning” is absolutely used within the field. It’s preposterous to claim otherwise.
Machine learning
Deep learning
Supervised learning
Unsupervised learning
Reinforcement learning
Transfer learning
Active learning
Meta-learning
Etc etc
Do you think journalists came up with all these terms? “Learning” is the go-to word for training methodologies. Yes, it’s training, but what do we say a subject that is being trained is doing? Learning (well, one would hope).
In fact, I’d argue that if “learning” is inappropriate, then so is “training”.
I just learned that there is no undo button in Gboard, and I cannot be bothered to type all that again. Just imagine I said something clever.
i’m not sure of the point you’re trying to make in this post.
i’m well aware of how loosely language is used in the field of machine learning.
in an attempt to stay on topic: training a model on a dataset cannot and should not be described as “learning on its own”. no computer “decided” to do the “learning”. what happened is that a person invoked commands to make the computer do a bunch of matrix arithmetic on a big dataset, with a model as the output. later on, some other people queried the model and observed the output. that’s not learning on its own.
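to make that concrete, here is a minimal, hypothetical sketch (plain linear regression by gradient descent, not any particular product’s pipeline) of what “training” amounts to. every number and step below is invented for illustration; the point is that a person runs the script, the computer grinds through matrix arithmetic, and the “model” is just the array that falls out at the end:

```python
import numpy as np

# Hypothetical dataset: 100 samples, 3 features, with targets generated
# from a known weight vector so we can see what "training" recovers.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
true_w = np.array([2.0, -1.0, 0.5])
y = X @ true_w

# The "model" is just a weight vector, initialized to zeros.
w = np.zeros(3)

# The human chose the iteration count and step size; nothing here
# "decides" to learn. Each step is plain matrix arithmetic.
for _ in range(500):
    grad = X.T @ (X @ w - y) / len(X)  # gradient of mean squared error
    w -= 0.1 * grad                    # update the weights

# "Querying the model" later is just more arithmetic on the saved array;
# here w has converged to roughly the true weights [2, -1, 0.5].
print(np.round(w, 2))
```

the loop never does anything a spreadsheet full of formulas couldn’t do; whether we call the result “learning” is purely a question of terminology, which is the disagreement upthread.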
let me give you an example: if i give ffmpeg a bunch of stuff to encode overnight, would you say my computer is making a film on its own? of course you wouldn’t, that’s absurd.