No, just an example. But if you’ve ever noticed the giant list of safety warnings on industrial machinery, you should know that every single one of those rules was written in blood.
However, this tool doesn’t have any safety warnings written on it. The app they used specifically caters to use cases like this. They advertise immoral uses, and we’ve had the technology to estimate age from pictures for like 10 years, yet they deliberately chose to let their tool generate pictures of what look like 13-year-old girls. In the tool analogy, that’s like knowingly selling a jigsaw that misses well-established safety standards and is likely to injure someone. And it’s debatable whether it was made to cut wood at all, or just to injure people.
And the rest fits, too. No company address, located in some country where they can’t be prosecuted… They’re well aware of the use case of their app.
Sometimes other bodily fluids.
The machines need to be oiled somehow.
🤨 vine boom
Either Darwin Awards or assholes, most likely. Those warnings are written out of fear of lawsuits.