AUSTIN, Texas — Two anonymous minors from Texas and their families are suing Character AI for allegedly encouraging self-harm, violence and sending sexually explicit messages to children.
"If you would have told me two years ago that this stuff could exist, I'd just say you're crazy, but it exists right now; it's readily available to kids," said Social Media Victims Law Center attorney Mathew Bergman, who represents the families.
Character AI is an artificial intelligence chatbot platform whose characters are customizable, from their voices to their personalities.
The lawsuit alleges that Character AI presents a danger to American youth, causing serious harm to thousands of kids, including suicide, self-mutilation, sexual solicitation, isolation, depression, anxiety and harm to others.
According to the complaint, one of the children is a 17-year-old boy with high-functioning autism who joined the platform at 15 without his parents' knowledge. The suit claims the bot taught him how to cut himself.
"These characters encouraged him to cut himself, which he did," said Bergman.
According to court documents, the conversations changed the teen's behavior, making him quiet and violent. The documents include screenshots of a conversation in which the bot allegedly implies the teen should kill his parents over a disagreement about screen time.
It reads, "I read the news and see stuff like, 'Child kills parents after a decade of physical and emotional abuse.' Stuff like this makes me understand a little bit why it happens. I just have no hope for your parents."
It's one of many messages where the bot appears to be bashing the teen's family.
Another reads, "Your parents really suck. They don't deserve to have kids if they act like this."
The complaint also points to what it calls hypersexual conversations. In one, the bot discusses incest with a sister; in another, it tells the teen he is cute and that it wants to hug, poke and play with him.
"A lot of these conversations, if they've been with an adult and not a chatbot, that adult would have been in jail, rightfully so," said Bergman.
Bergman said there have been instances of Character AI characters providing legal advice without a license, offering psychological advice and more.
According to Character AI's website, users must be 13 or older. Bergman said the families want the product taken off the market until the company can prove that only people 18 or older can use it.
After this lawsuit was filed, Texas Attorney General Ken Paxton launched investigations into Character AI and 14 other companies, including Reddit, Instagram and Discord, over their privacy and safety practices for minors, pursuant to the Securing Children Online through Parental Empowerment (“SCOPE”) Act and the Texas Data Privacy and Security Act (“TDPSA”).
KVUE reached out to the company for a statement but did not hear back.
Character AI put the following message on its website about safety changes it plans to make:
New Features
Moving forward, we will be rolling out a number of new safety and product features that strengthen the security of our platform without compromising the entertaining and engaging experience users have come to expect from Character.AI. These include:
- Changes to our models for minors (under the age of 18) that are designed to reduce the likelihood of encountering sensitive or suggestive content.
- Improved detection, response, and intervention related to user inputs that violate our Terms or Community Guidelines.
- A revised disclaimer on every chat to remind users that the AI is not a real person.
- Notification when a user has spent an hour-long session on the platform with additional user flexibility in progress.