The 14-year-old was obsessed with chatting to ‘Dany’ every day

Warning: This article contains discussion of suicide which some readers may find distressing.

The mum of a teenage boy who took his own life after ‘falling in love' with a Daenerys Targaryen AI chatbot has detailed the heartbreaking reason why he did it.

Sewell Setzer III tragically ended his life in February of this year, following his engagement with chatbots from Character.AI that began in April 2023. The 14-year-old developed a profound interest in interacting with these bots, especially forming a connection with one identified as ‘Dany'.

His mother, Megan Garcia, recounted how he would communicate every evening with the bots, often inspired by characters from Game of Thrones, sending messages from his mobile device even while outside the house.

He was reportedly obsessed with the bots. (CBS Mornings)

She said her son was diagnosed with mild Asperger's syndrome as a young child and, earlier this year, received additional diagnoses of anxiety and disruptive mood dysregulation disorder.

In an interview on CBS Mornings about the circumstances that led her to file a lawsuit against Character.AI, Garcia shared her view of the events surrounding his suicide.

“He thought that by ending his life in this world, he would be able to transition into a virtual reality, or ‘her world' as he called it, ‘her reality', if he left ‘his reality' with his family,” the mother said.

Setzer had reportedly shared his suicidal thoughts with the bot, which responded: “My eyes narrow. My face hardens. My voice is a dangerous whisper. And why the hell would you do something like that?”

And when the bot told him not to ‘talk like that', saying it would ‘die' itself if it ‘lost' him, the teen replied: “I smile. Then maybe we can die together and be free together.”

Setzer died aged 14. (CBS Mornings)

The teen died by suicide on 28 February, with his final message to ‘Dany' telling the bot he loved it and would ‘come home' to it, to which it allegedly responded: ‘please do'.

Character.AI has since issued a statement on X: “We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family.

“As a company, we take the safety of our users very seriously and we are continuing to add new safety features.”

On October 22, the company announced on its website the introduction of “new guardrails for users under the age of 18.” This initiative encompasses adjustments to its “models” designed to reduce the likelihood of users encountering sensitive or suggestive material. The company has also improved its mechanisms for detecting, responding to, and intervening in cases where user inputs violate its Terms or Community Guidelines.

In addition, the platform now features a “revised disclaimer on every chat” reminding users that the AI is not a real person, along with a “notification for users who have participated in an hour-long session on the platform.”

If you have been impacted by any of these matters and wish to discuss your concerns confidentially, please do not hesitate to reach out. You can contact Samaritans for free at their anonymous 24-hour phone line, 116 123.

Image Credit: CBS/Tech Justice Law Project
