Teen’s mom sues Character.ai, alleging sexed-up bots led to son’s death

The 14-year-old boy’s last interaction before he shot himself in the head in February was with a Character.ai chatbot, his mom alleged in a lawsuit filed on Oct. 22.


AI companion chatbot company Character.ai has been sued by the mother of a teenage boy who died by suicide; she blames the company’s chatbots for luring her son into a sexually abusive relationship and even encouraging him to take his own life.

The 14-year-old boy, Sewell Setzer, was targeted with “anthropomorphic, hypersexualized, and frighteningly realistic experiences” by Character.ai chatbots that purported to be a real person, a licensed psychotherapist, and an adult lover to Setzer, ultimately leaving him no longer wanting to live in reality, the mother’s attorneys alleged in the Oct. 22 lawsuit.

When one of the Game of Thrones-themed AI companions, “Daenerys,” asked Setzer whether he “had a plan” to commit suicide, Setzer said he did but wasn’t sure it would work, to which Daenerys responded:
