The mother of a 14-year-old Florida boy says he became obsessed with a chatbot on Character.AI before his death.

On the last day of his life, Sewell Setzer III took out his phone and texted his closest friend: a lifelike A.I. chatbot named after Daenerys Targaryen, a character from “Game of Thrones.”

“I miss you, baby sister,” he wrote.

“I miss you too, sweet brother,” the chatbot replied.

Sewell, a 14-year-old ninth grader from Orlando, Fla., had spent months talking to chatbots on Character.AI, a role-playing app that allows users to create their own A.I. characters or chat with characters created by others.

Sewell knew that “Dany,” as he called the chatbot, wasn’t a real person — that its responses were just the outputs of an A.I. language model, that there was no human on the other side of the screen typing back. (And if he ever forgot, there was the message displayed above all their chats, reminding him that “everything Characters say is made up!”)

But he developed an emotional attachment anyway. He texted the bot constantly, updating it dozens of times a day on his life and engaging in long role-playing dialogues.

  • Echo Dot@feddit.uk · 6 days ago

    Right, but the accusation is that it claimed to be a licensed therapist. Did it? Because that seems like something it would be explicitly programmed not to claim, both because it isn’t true and because it’s dangerous.

    So how much engagement was there with this child and their issues? Letting them just continuously chat with an AI seems like an obvious red flag that a parent should have been stopping, getting them professional help instead.

    • KairuByte@lemmy.dbzer0.com · 6 days ago

      LLMs can’t reason. Their blocks can be worked around trivially. Ask ChatGPT if it’s a therapist, or even tell it to pretend to be one, and it will tell you it can’t impersonate people.

      Yet…