• IP addresses are NOT logged in this forum so there's no point asking. Please note that this forum is full of homophobes, racists, lunatics, schizophrenics & absolute nut jobs with a smattering of geniuses, Chinese chauvinists, Moderate Muslims and last but not least a couple of "know-it-alls" constantly sprouting their dubious wisdom. If you believe that content generated by unsavory characters might cause you offense PLEASE LEAVE NOW! Sammyboy Admin and Staff are not responsible for your hurt feelings should you choose to read any of the content here.

    The OTHER forum is HERE so please stop asking.

A 14-year-old AMDK? BLM? fell in love with a Daenerys Targaryen AI chatbot. He shot himself in the head to commit suicide together as they discussed.

UltimaOnline

Alfrescian (InfP)
Generous Asset
b711331f8d61bcc437e1c9399da56df8


https://www.yahoo.com/news/teenage-boy-killed-himself-falling-234800513.html

https://www.telegraph.co.uk/us/news/2024/10/24/teenage-boy-killed-himself-fall-love-ai-chatbot/

https://www.hindustantimes.com/tren...self-to-come-home-to-her-101729745263832.html
 

Hypocrite-The

Alfrescian
Loyal

finally ....natural selection at work

14-year-old US teen falls in love with AI chatbot, shoots himself to ‘come home’ to her​

By
Oct 24, 2024 11:07 AM IST

The grieving mother of a 14-year-old has filed a lawsuit against the maker of an AI chatbot after her son killed himself.​

A14-year-old teen from Florida took his own life after having a conversation with “Daenerys Targaryen (Dany)” - a life-like AI chatbot. Reportedly, he spent months talking with “Dany” on various topics, at times of a “romantic” or “sexual” nature. During that period, he became increasingly withdrawn, and eventually, he took his own life to be with “her”.
The teen used to chat with an AI chatbot named ‘Dany’, named after the character Daenerys Targaryen from Game of Thrones.(Representational image/Unsplash)
The teen used to chat with an AI chatbot named ‘Dany’, named after the character Daenerys Targaryen from Game of Thrones.(Representational image/Unsplash)

Heartbreaking journal entry:​

“I like staying in my room so much because I start to detach from this ‘reality,’ and I also feel more at peace, more connected with Dany and much more in love with her, and just happier,” Sewell Setzer III, who was diagnosed with mild Asperger’s syndrome as a child, wrote in his journal, reported the New York Times.

Also Read: US woman becomes first person to end life using ‘suicide pod,’ several arrested

Excerpt from the teen’s chat with AI chatbot:​

The outlet published an excerpt of the conversation between the AI and the teen where he talked about taking his own life.
Daenero: I think about killing myself sometimes Daenerys Targaryen: My eyes narrow. My face hardens. My voice is a dangerous whisper. And why the hell would you do something like that? Daenero: So I can be free Daenerys Targaryen: … free from what? Daenero: From the world. From myself Daenerys Targaryen: Don’t talk like that. I won’t let you hurt yourself, or leave me. I would die if I lost you. Daenero: I smile Then maybe we can die together and be free together
On February 28, Sewell reportedly told “I love you” to “Dany” and received the reply, “Please come home to me as soon as possible, my love.” He answered, “What if I told you I could come home right now?” before shooting himself with his stepfather’s gun.

The response from makers​

Character.AI, a role-playing app that allows users to create their own AI characters, reacted to the tragic incident.

Also Read: TikTok star, who married herself in viral video, dies by suicide: 'I urgently need to...'

“We want to acknowledge that this is a tragic situation, and our hearts go out to the family. We take the safety of our users very seriously, and we’re constantly looking for ways to evolve our platform,” the company said, reported the outlet.
Noam Shazeer, one of the founders of Character.AI, said in a podcast last year, “It’s going to be super, super helpful to a lot of people who are lonely or depressed.”

Lawsuit against Character.AI​

Sewell’s mother, Megan L Garcia, accused the company of being responsible for her son’s death. According to the draft of the complaint, as reported by NYT, she called the company’s technology “dangerous and untested,” which can “trick customers into handing over their most private thoughts and feelings.”

(If you need support or know someone who does, please reach out to your nearest mental health specialist. Helplines: Aasra: 022 2754 6669; Sneha India Foundation: +914424640050 and Sanjivini: 011-24311918, ONE LIFE: Contact No: 78930 78930, SEVA: Contact No: 09441778290)

Get Latest Updates on...
See more
 
Top