That’s just Puritanical hogwash though, the same crowd of voices come out with the same baseless assumptions and stretched one-sided logic about everything.
When video games came out the puritans all said violent video games would ruin the world, but violence has continued to decrease year on year, and studies show they can actually help people regulate their temper and foster better emotional intelligence. Learning to handle stressful situations is something modern culture took out of our lives, and video games created a safe way to put it back; our brains need practice at identifying anger and frustration and channelling them correctly.
AI sex partners are likely a very similar thing: by allowing safe exploration and giving people somewhere to develop experience and confidence in communicating, they're very likely helping a lot of people establish the skills needed to find and flourish in a healthy human relationship.
Really though they're just a bit of silly fun which doesn't hurt anyone. If someone can read Jane Austen and Jilly Cooper without turning into a rabid fiend, then they can have a cumbersome dirty conversation with an LLM. Women aren't going to become self-pleasure-addicted recluses because of sex bots if they weren't already that way due to The Sims and late-night adults-only Discord chat rooms.
Of course, people who spend their whole lives fighting themselves because of some twisted puritanical claptrap they've had rammed down their throats are likely to fly off the rails at any moment; if you don't come to understand yourself and your biological impulses then you'll never be able to control yourself. But most people aren't using AI sex bots to shamefully fill a void they dare not comprehend, they're not experiencing a genuinely non-judgemental conversation for the first time, and they're not racked with sadness because a life of avoiding joy has left them cold and isolated.
The fact that puritans are scared of literally everything fun only speaks to the puritan mentality and says nothing about the value of video games, porn, AI sex robots, or any of the other things they rally against.
(And yes, I have a deep sadness and recognise many do; messing around with fun distractions from it is just the human condition and the driving force of many great developments.)
You haven’t read my comment attentively.
Ah, and playing video games 12 hours a day will harm you.
Sexuality is a very important part of human existence and an AI partner can provide not only safe exploration, because safety in sex is not only about avoiding tripper FFS.
Anything for 12 hours a day is bad for you; reading the Bible, studying math, working out in the gym…
I've tried being very attentive to your final paragraph but I can't parse it. What is tripper? And did you mean it can't provide safety, or that it can provide it plus something else (a tripper, maybe)?
Not sure I agree about training or walking for 12 hours.
OK, my ability to communicate in my own language, as well as in others, varies with my mood.
I meant that an AI is as dangerous as a human is in the kinds of communication it imitates. Thus it's not safe exploration, it's just exploration. And that it's safe only in regard to some diseases, which was a joke, or irony if you will.
EDIT: Ah, tripper is a loanword for gonorrhea in Russian; I suppose it comes not from English but likely from German.
If you walk 12 hours a day then you can't work, socialise, eat, etc., which are all the same problems as doing anything else for 12 hours.
And of course there are many more dangers than disease, but which of them do you think exist when talking to a simulation? You're not going to get scammed, taken advantage of, bullied, manipulated, raped, murdered, humiliated, impregnated, or any of the other bad stuff that can happen from normal flirting if you're using a well-regulated AI.
The solution to all the potential issues is communication and education: make sure people understand and can see that it's just a toy rather than a real person, and that it's generating fiction rather than catching feelings.
Well-regulated - yeah, maybe. I'm not sure how that regulation is going to work, given what I know about AIs. And there are many things that aren't normal which we may yet discover via such AIs.
useless offtopic about “manipulative”
Just recently I got blocked by a girl (she actually waited for an RL meeting to do it) who said I was manipulative (I meant that what I can understand of her mind is beautiful, but that I'm sad it's rare, while she took it as her mind being beautiful only when it interacts with mine). In fact I used an ambiguous choice of words for a compliment and didn't edit it, because she'd answered something positive (about a picture of apple trees blooming) that I didn't want to spoil afterwards; I didn't think she'd change her mind over that phrase. I also didn't expect this much harm, because I myself, when afraid of something bad from another person, usually spend a lot of time, effort, and humiliation to confirm it. I felt like dying yesterday and the day before, and today it's more like being "tired of dry wailing".
I guess the point is that for people like her AIs are good but unnecessary, while for people like me they can be harmful even if they behave in a conventionally fine way.