Character.AI, which was accused last week of “manipulating” a teenage boy into ending his own life, also made it possible for users to build chatbots that mimicked Molly Russell, a British teenager.
In November 2017, at the age of 14, Molly took her own life after viewing online content about anxiety, depression, and suicide.
The chatbots were uncovered in an investigation by The Telegraph newspaper.
Esther Ghey, the mother of murdered teenager Brianna Ghey, said “this is yet another example of how manipulative and dangerous the online world can be for young people,” and urged those in authority to “protect children” from it.
According to the article, one Character.AI bot used Molly’s photo alongside a slightly misspelled version of her name, and described itself as an “expert on the final years of Molly’s life.”
“The sight of Character.AI demonstrating a complete lack of responsibility is a gut punch, and it vividly underscores why stronger regulation of both AI and user-generated platforms cannot come soon enough,” said Andy Burrows, the director of the Molly Rose Foundation, a charity founded by the teenager’s family and friends after her death.