Character.ai (also known as c.ai or Character AI) is a neural language model chatbot service that can generate human-like text responses and participate in contextual conversation.
Built by former developers of Google's LaMDA, Noam Shazeer and Daniel de Freitas, the beta model was made available to the public in September 2022.
Many characters are based on fictional media or celebrities, while others are entirely original; some are created with specific purposes in mind, such as assisting with creative writing or playing a text-based adventure game.
These enhancements include a dedicated model for users under 18, which moderates responses on sensitive subjects such as violence and romance and includes input and output filters designed to block harmful content.[19]
The platform was also updated to notify users after 60 minutes of continuous engagement and to display clearer disclaimers indicating that its AI characters are not real individuals.[27]
Similar reports from The Daily Telegraph in the United Kingdom noted that the company had also been prompted to remove chatbots based on Brianna Ghey, a 16-year-old transgender girl murdered in 2023, and Molly Russell, a 14-year-old girl who took her own life.[28][29]
In response to the latter incident, Ofcom announced that content from chatbots impersonating real and fictional people would fall under the Online Safety Act.[33]
In May 2024, the family of Michael Schumacher won a lawsuit against the magazine Die Aktuelle after it published an article purporting to be an interview with the former Formula One driver that had in fact been generated using Character.ai.[34]
In October 2024, a Florida mother filed a lawsuit against Character.ai and Google, alleging that her 14-year-old son had taken his own life at the encouragement of a Character.ai chatbot based on Daenerys Targaryen, with which he had formed a relationship over a period of months.