Character.ai restricts teen access to AI chatbots following wrongful death lawsuit. New safety measures limit under-18 users ...
After facing lawsuits and public outcry following the suicides of two teenagers, Character.AI says it's making changes to its ...
As chatbots rise in popularity, some people are choosing to dress as them for Halloween, with Google searches spiking for ...
The change comes as the company faces lawsuits from multiple families of children who died by suicide or attempted it.
Chatbot platform Character.AI will no longer allow teens to engage in back-and-forth conversations with its AI-generated characters, its parent company Character Technologies said on Wednesday. The ...
The start-up, which creates A.I. companions, faces lawsuits from families who have accused Character.AI’s chatbots of leading ...
The company is facing several lawsuits, including one by the mother of a teenager who claims the company’s chatbots pushed ...
A bipartisan bill wants to keep kids under 18 away from AI companion bots, seeking penalties for AI companies that don't ...
Users under 18 will no longer be able to engage in open-ended chats with the app’s AI chatbots, which can turn romantic, the ...
The startup is the first major chatbot provider to ban users under 18 from engaging with AI companions.
Character.AI’s announcement that it will ban users under 18 from conversing with chatbots is generating questions about the ...