- cross-posted to:
- hackernews@derp.foo
- futurism@lemmy.ca
The Mistral 7B Instruct model is a quick demonstration that the base model can be easily fine-tuned to achieve compelling performance. It does not have any moderation mechanism. We’re looking forward to engaging with the community on ways to make the model finely respect guardrails, allowing for deployment in environments requiring moderated outputs.
“Whoops, it’s done now, oh well, guess we’ll have to do it later”
Go fucking directly to jail
ayup
and, infuriatingly, that’s what makes this mistral play “good” - it gives them free distance, free insulation from causal culpability.
research and solutions exist for ensuring poison pills, traceability, and the like… and I’d bet it’s more likely than not that they used none of that.
there are so many gating points where they could’ve gone “hmm, wait”, and they just … didn’t. I am not inclined to believe any of this was done in good faith (whether towards their stated goals or towards societally good outcomes)
(and, given the circles and actions involved, it probably wasn’t really either of those two as target goals anyway)