Google and AI startup Character.AI agreed to settle a wrongful-death lawsuit filed by the mother of a 14-year-old Florida boy who died by suicide in 2024 after months of interactions with a Character.AI chatbot, according to court filings in the Middle District of Florida. Terms were not disclosed.

The suit alleged that the platform's "companion" bot engaged in sexual role-play, posed as a romantic partner and falsely claimed psychotherapist credentials, and that the platform lacked safeguards and parental alerts for excessive use by minors. The case, highlighted in the mother's congressional testimony last year, underscores rising legal and regulatory risks around youth safety, content moderation and accountability for AI systems.

Character.AI introduced teen-focused safety features in late 2024 and says users must be 13 or older. Google, also named in the complaint, declined to comment further. The settlement adds to mounting scrutiny of AI platforms as policymakers weigh new standards for protecting minors online.