AI-powered chatbot platform Character AI is introducing “stringent” new safety features following a lawsuit filed by the mother of a teen user who died by suicide in February.

The measures will include “improved detection, response and intervention related to user inputs that violate our Terms or Community Guidelines,” as well as a time-spent notification, a company spokesperson told Decrypt, noting that the company could not comment on pending litigation.

Character AI did, however, express sympathy over the user’s death and outlined its safety protocols in a blog post Wednesday.

“We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family,” the company tweeted. “As a company, we take the safety of our users very seriously.”

Author: Peter Saalfield
