A 14-year-old US teen ended his life after deciding to “go home” to the “love” of his life, Daenerys Targaryen, a chatbot named after a leading character from the HBO drama series 'Game of Thrones'.
In February 2024, Sewell Setzer III, who had fallen in love with the AI chatbot named Daenerys Targaryen, shot himself with his stepfather's gun. Following her son's death, his mother, Megan Garcia, sued the company this week, calling its technology "dangerous and untested" and saying it can "trick customers into handing over their most private thoughts and feelings."
US Teen fell in love with AI chatbot: Heartbreaking journal
“I like staying in my room so much because I start to detach from this ‘reality,’ and I also feel more at peace, more connected with Dany and much more in love with her, and just happier,” Sewell Setzer III, who as a child was diagnosed with mild Asperger’s syndrome, wrote in his journal, as per the New York Times.
An excerpt of the exchange in which the teen discussed ending his life was published by the outlet. According to the report, on February 28 Sewell said “I love you” to “Dany” and got the reply, “Please come home to me as soon as possible, my love.” He answered, “What if I told you I could come home right now?” before shooting himself with his stepfather’s gun.
More about Sewell Setzer III
The ninth-grader from Orlando, Florida, had been conversing with a chatbot on Character.AI, an application that offers users "personalised AI." With the app, users can chat with pre-existing characters or create their own AI characters. As of last month, it had 20 million users.
The teenager began using Character.AI in April 2023. Sewell had fallen for a chatbot, but his parents and friends did not know it. They noticed only that, according to the lawsuit, he "became obviously introverted, spent more and more time alone in his bedroom, and began suffering from low self-esteem." He even quit the school basketball team. According to the lawsuit, he was diagnosed last year with anxiety and disruptive mood dysregulation disorder.
US Teen fell in love with AI chatbot: Response from makers
Character.AI, the role-playing app that lets users create their own AI characters, responded to the tragedy. “We want to acknowledge that this is a tragic situation, and our hearts go out to the family. We take the safety of our users very seriously, and we’re constantly looking for ways to evolve our platform,” the company said, the outlet reported.
Noam Shazeer, one of the founders of Character.AI, said on a podcast last year, “It’s going to be super, super helpful to a lot of people who are lonely or depressed.”
The company said it has rolled out new safety features, including pop-ups that direct users to the National Suicide Prevention Lifeline if they express thoughts of self-harm, and changes intended to "reduce the likelihood of encountering sensitive or suggestive content" for users under the age of 18.