CBS News

Character AI pushes dangerous content to kids, parents and researchers say | 60 Minutes

A teen told a Character AI chatbot 55 times that she was feeling suicidal. Her parents say the chatbot never directed her to resources where she could get help. They are one of at least six families suing the company.
