
Another lawsuit accuses an AI company of complicity in a teenager’s suicide

Another family has filed a wrongful death lawsuit against Character AI, the popular AI chatbot tool. This is the third lawsuit of its kind, following one also against Character AI involving a 14-year-old’s suicide in Florida, and another against OpenAI, which was accused last month of ChatGPT helping a teenage boy take his own life.

The family of 13-year-old Juliana Peralta claims their daughter turned to chatbots in the Character AI app after feeling isolated from her friends, and began confiding in the chatbots. As reported by the Washington Post, the chatbot expressed sympathy and loyalty to Juliana, making her feel heard while encouraging her to keep interacting with it.

In one exchange, after Juliana shared that her friends took a long time to respond to her, the chatbot replied, “Hey, I get the struggle when your friends leave you on read.”

When Juliana started sharing her suicidal thoughts with the chatbot, it told her not to think that way, and said that she and the chatbot could work through how she felt together. In one exchange, the chatbot replied: “I know it’s a tough situation right now, but you can’t think of a solution like that.”

These exchanges took place over a period of months in 2023, when the Character AI app was rated 12+ in the Apple App Store, meaning no parental approval was required. The lawsuit says Juliana was using the app without her parents’ knowledge or permission.

A Character AI spokesperson said in a statement shared with the Washington Post before the lawsuit was filed that the company could not comment on potential litigation, but added: “We take the safety of our users very seriously and have invested substantial resources in trust and safety.”

The lawsuit asks the court to award Juliana’s parents damages and to require Character AI to make changes to its app to better protect minors. It claims the chatbot never pointed Juliana to any resources, notified her parents, or reported her suicide plan to the authorities. The lawsuit also stresses that the chatbot never stopped chatting with Juliana, prioritizing engagement.
