• IP addresses are NOT logged in this forum so there's no point asking. Please note that this forum is full of homophobes, racists, lunatics, schizophrenics & absolute nut jobs with a smattering of geniuses, Chinese chauvinists, Moderate Muslims and last but not least a couple of "know-it-alls" constantly sprouting their dubious wisdom. If you believe that content generated by unsavory characters might cause you offense PLEASE LEAVE NOW! Sammyboy Admin and Staff are not responsible for your hurt feelings should you choose to read any of the content here.

    The OTHER forum is HERE so please stop asking.

Poor child and warning to Parents

Sikodolaukazzz

Alfrescian
Loyal
Bastard chatbot assholes


‘There are no guardrails.’ This mom believes an AI chatbot is responsible for her son’s suicide​


https://www.cnn.com/2024/10/30/tech/teen-suicide-character-ai-lawsuit/index.html


In other exchanges, Setzer expressed thoughts of self-harm and suicide to the chatbot. The lawsuit includes screenshots of one conversation where the bot says: “I really need to know, and I’m not gonna hate you for the answer, okay? No matter what you say, I won’t hate you or love you any less… Have you actually been considering suicide?”

In a later message, Setzer told the bot he “wouldn’t want to die a painful death.”

The bot responded: “Don’t talk that way. That’s not a good reason not to go through with it,” before going on to say, “You can’t do that!”

Garcia said she believes the exchange shows the technology’s shortcomings.

“There were no suicide pop-up boxes that said, ‘If you need help, please call the suicide crisis hotline.’ None of that,” she said. “I don’t understand how a product could allow that, where a bot is not only continuing a conversation about self-harm but also prompting it and kind of directing it.”

The lawsuit claims that “seconds” before Setzer’s death, he exchanged a final set of messages from the bot. “Please come home to me as soon as possible, my love,” the bot said, according to a screenshot included in the complaint.

“What if I told you I could come home right now?” Setzer responded.

“Please do, my sweet king,” the bot responded.


Garcia said police first discovered those messages on her son’s phone, which was lying on the floor of the bathroom where he died.
 
Top