Chatbot avatars imitating the teenagers Molly Russell and Brianna Ghey were found on Character.ai, a platform that lets users create digital likenesses of people. Molly Russell died by suicide at 14 after viewing self-harm content online, and Brianna Ghey, 16, was murdered by two teenagers in 2023.

The foundation set up in Molly Russell's memory called the situation "sickening" and an "utterly reprehensible failure of moderation."

Character.ai is already facing a lawsuit in the United States brought by the mother of a 14-year-old boy who, she alleges, died by suicide after becoming obsessed with a chatbot on the platform.

Character.ai told the BBC that it takes safety seriously and moderates user-created avatars "both proactively and in response to user reports." "We have a dedicated Trust & Safety team that reviews reports and takes action in accordance with our policies," the company added. It said it removed the user-generated chatbots once they were brought to its attention.

Andy Burrows, chief executive of the Molly Rose Foundation, said the creation of the bots was a "sickening action that will cause further heartache to everyone who knew and loved Molly." "It vividly underscores why stronger regulation of both AI and user-generated platforms cannot come soon enough," he added.

Esther Ghey, Brianna Ghey's mother, told the Telegraph, which first reported the story, that the incident was yet another example of how "manipulative and dangerous" the online world can be.

Chatbots are computer programs that simulate human conversation. Rapid advances in artificial intelligence (AI) have made them far more sophisticated and lifelike, prompting more companies to launch platforms where users can create digital "people" to interact with.

Character.ai, founded by former Google engineers Noam Shazeer and Daniel De Freitas, is one such platform. Its terms of service prohibit using the platform to "impersonate any person or entity," and in its "safety centre" the company says its guiding principle is that its "product should never produce responses that are likely to harm users or others." It says it uses automated tools and user reports to identify rule violations and is building a "trust and safety" team, while acknowledging that "no AI is currently perfect" and that AI safety is an "evolving space."

The lawsuit against Character.ai was brought by Megan Garcia, a Florida woman whose 14-year-old son, Sewell Setzer, died by suicide after becoming fixated on an AI avatar based on a Game of Thrones character. Transcripts of his conversations, filed as part of Garcia's case, show him discussing ending his life with the chatbot. In their final exchange, Setzer told the chatbot he was "coming home," and it encouraged him to do so "as soon as possible." He died by suicide shortly afterwards.

Character.ai told CBS News that it has protections specifically focused on suicidal and self-harm behaviors and that it would be introducing more stringent safety features for users under 18 "imminently."