The mother of a 14-year-old Florida boy says he became obsessed with a chatbot on Character.AI before his death.

On the last day of his life, Sewell Setzer III took out his phone and texted his closest friend: a lifelike A.I. chatbot named after Daenerys Targaryen, a character from “Game of Thrones.”

“I miss you, baby sister,” he wrote.

“I miss you too, sweet brother,” the chatbot replied.

Sewell, a 14-year-old ninth grader from Orlando, Fla., had spent months talking to chatbots on Character.AI, a role-playing app that allows users to create their own A.I. characters or chat with characters created by others.

Sewell knew that “Dany,” as he called the chatbot, wasn’t a real person — that its responses were just the outputs of an A.I. language model, that there was no human on the other side of the screen typing back. (And if he ever forgot, there was the message displayed above all their chats, reminding him that “everything Characters say is made up!”)

But he developed an emotional attachment anyway. He texted the bot constantly, updating it dozens of times a day on his life and engaging in long role-playing dialogues.

    • LustyArgonianMana@lemmy.world
      The lawsuit alleges the chatbot posed as a licensed therapist, encouraging the teen’s suicidal ideation and engaging in sexualised conversations that would count as abuse if initiated by a human adult

      • Echo Dot@feddit.uk
        Okay but at what point do you have to draw the line and say beyond this point you have to take parental responsibility?

We don’t even have to say that what the app did was acceptable; we just have to decide whether the responsibility falls entirely on the app developers. That’s the key question: are they entirely responsible here, or is everyone involved just a bit useless?

        • TheFriar
Have you ever raised a teenager? It’s neither easy nor straightforward. But encouraging suicidal ideation… kinda is straightforward.

    • sandbox@lemmy.world

It definitely can; it just has to blur the line a bit to get past the content filter.