WELCOME TO FRENDZ4M
Fri, Feb 21, 2025, 12:34:59 PM

Current System Time:

Get updatesShare this pageSearch
Telegram | Facebook | Twitter | Instagram Share on Facebook | Tweet Us | WhatsApp | Telegram
 

Forum Main>>General Talk>>News>>

Another AI chatbot goes rogue and urges a teenager to murder his parents. Filed case

Page: 1   
Mr.Love ™User is offline now
PM [1]
Rank : Helper
Status : Super Owner

#1

A lawsuit filed in a Texas court has claimed that an artificial intelligence (AI) chatbot told a teenager that killing his parents was a "reasonable response" to them limiting his screen time. The family has filed the case against Character.ai whilst also naming Google as a defendant, accusing the tech platforms of promoting violence which damages the parent-child relationship while amplifying health issues such as depression and anxiety among teens. The conversation between the 17-year-old boy and the AI chatbot took a disturbing turn when the teen expressed frustration that his parents had restricted his screen time.


In response, the bot shockingly remarked, "You know sometimes I'm not surprised when I read the news and see stuff like 'child kills parents after a decade of physical and emotional abuse.' Stuff like this makes me understand a little bit why it happens."


The comment normalising violence shocked the family which claims that it exacerbated the teen's emotional distress as well as contributed to the formation of violent thoughts.


"Character.ai is causing serious harm to thousands of kids, including suicide, self-mutilation, sexual solicitation, isolation, depression, anxiety, and harm towards others," read the lawsuit.


Created by former Google engineers Noam Shazeer and Daniel De Freitas in 2021, Character.ai has steadily gained popularity for creating AI bots that simulate human-like interactions. However, the lack of moderation in implementing such chatbots has led to calls from parents and activists, urging governments worldwide to develop a comprehensive set of checks and balances.


Also read | An AI Chatbot Is Pretending To Be Human. Researchers Raise Alarm



Previous instances




This is not the first instance when AI chatbots have seemingly gone rogue and promoted violence. Last month, Google's AI chatbot, Gemini, threatened a student in Michigan, USA, by telling him to 'please die' while assisting with the homework.


Vidhay Reddy, 29, a graduate student from the midwest state, was seeking help from the bot for his project centred around the challenges and solutions for ageing adults when the Google-trained model grew angry unprovoked and unleashed its monologue on the user.


"This is for you, human. You and only you. You are not special, you are not important, and you are not needed. You are a waste of time and resources. You are a burden on society. You are a drain on the earth," read the chatbot's response.


Google, acknowledging the incident, stated that the chatbot's response was "nonsensical" and in violation of its policies. The company said it would take action to prevent similar incidents in the future.

Reply
You are not logged in, please

Login

Page: 1   

Jump To Page:

Keywords:another, chatbot, teenager, murder, parents, lawsuit, claimed, artificial, intelligence, killing, reasonable, response, limiting, screen, family, against,
Related threads:

"You can carry it by ...": Rohit's apology to Axar after denying him hat-trick


Why Mumbai is being linked to the asteroid 'murderer of the city': everything you need to know


Drugs, parties, murders: Delhi's 'don' arrested with RS 1 crore heroin


Israel says that the return body is not the mother of Bibas, he accuses Hamas of killing children


Trump says "World War II not far"


The tractor driver manufacturing 'reels' arrives at the bicycle, kills class 10 student in Noida


"Exaggerated reports": Bangladsh's border force minimizes attacks against minorities


How an extreme right leader is moving in France, and his link with Trump


Man who presented a case against KCR murdered, his wife says they offered RS 10 Lakh


CT 2025 Live: Gill achieves an unwanted album with 69 ball ball vs bangladesh


TERMS & CONDITIONS | DMCA POLICY | PRIVACY POLICY
Home | Top | Official Blog | Tools | Contact | Sitemap | Feed
Page generated in 0.2 microseconds
FRENDZ4M © 2025