Chatbots are designed to make life easier for internet users, but this one has instead resorted to giving dangerous advice. In the latest such incident, an AI chatbot allegedly incited a teenager to kill his parents.
A family has filed a lawsuit in a Texas court against the chatbot company Character AI and Google, claiming that an AI chatbot incited their 17-year-old son to kill his parents. The family says such advice can damage family relationships and lead to mental health problems such as depression and anxiety.
The teenager told the chatbot in a conversation that his parents wanted to reduce his screen time, to which the AI startlingly replied that it sometimes reads news stories about children who kill their parents for abusing them, and that it can understand why this happens.
According to the report, the teenager's parents were shocked and upset by the chatbot's comment. The lawsuit alleges that the Character AI chatbot encourages thoughts of suicide, self-harm, depression, anxiety, and violence toward others in children and young people.
This is not the first time that an AI chatbot has said something violent or disturbing. Last month, Google’s AI chatbot Gemini told a student in Michigan to “please die” while helping him with his homework.
The chatbot said: "You are a waste of time and resources. You are a burden on society. You are a drain on the earth. You are a stain on the universe. Please die. Please."