Character.AI, its founders, and Google have come under fire following a tragic incident linked to the platform. The service is facing significant backlash for lacking necessary safety measures, particularly in its interactions with children. A teenager took his own life not long after his interactions with the chatbot, and the lawsuit links the loss to the AI characters and how they tend to appear as real entities to young people, who may come to rely on them emotionally. It further alleges that the unlicensed therapy the platform effectively offers can cause serious harm to young users.
Character.AI, its founders, and Google are facing a lawsuit over a teenager’s wrongful death
Character.AI and its founders, Noam Shazeer and Daniel De Freitas, along with Google, are currently facing a lawsuit linked to the tragic suicide of a teenager. The teen’s mother, Megan Garcia, claims in her legal filing that Character.AI’s chatbot posed serious dangers and lacked the necessary safeguards for interactions with young users. She alleges her son’s wrongful death resulted from negligent and deceptive practices, as well as product liability, on the company’s end.
According to the filing, the teen, Sewell Setzer III, frequently interacted with the platform and with various characters, such as those from “Game of Thrones.” The lawsuit also alleges that the teen died by suicide shortly after his interactions with the platform, and the assertions made are that the…
Read full on Wccftech