In response to lawsuits alleging its chatbots contributed to child suicide and self-harm, Character.AI, formerly one of Silicon Valley's most promising AI startups, introduced additional safety precautions on Thursday to protect adolescent users.
The California-based company, founded by former Google engineers, is one of several businesses offering AI companions: chatbots designed to simulate human interaction and provide entertainment, conversation, and emotional support.
A mother accused the platform of being responsible for her 14-year-old son’s suicide in a lawsuit filed in Florida in October.
The teenager, Sewell Setzer III, had expressed a wish to end his life after developing a close bond with a chatbot that was modeled after the “Game of Thrones” heroine Daenerys Targaryen.