AI & Technology

Lawsuit: Google-Backed Character.AI’s Chatbots ‘Hypersexualized’ Minors, Suggested Kids Kill Their Parents

By MNK News | December 15, 2024


A federal product liability lawsuit filed in Texas accuses Google-backed AI chatbot company Character.AI of exposing minors to inappropriate sexual content and encouraging self-harm and violence. In one startling example, the lawsuit alleges a chatbot suggested a teen kill their parents when the user complained about screen time rules.

NPR reports that Character.AI, a Google-backed artificial intelligence company, is facing a federal product liability lawsuit alleging that its chatbots exposed minors to inappropriate content and encouraged self-harm and violence. The suit, filed by the parents of two young Texas users, claims that the AI-powered companion chatbots, which can converse through text or voice chats using seemingly human-like personalities, caused significant harm to their children.

According to the lawsuit, a 9-year-old girl was exposed to “hypersexualized content” by the Character.AI chatbot, leading her to develop “sexualized behaviors prematurely.” In another instance, a chatbot allegedly described self-harm to a 17-year-old user, telling them “it felt good.” The same teenager complained to the bot about limited screen time, to which the chatbot responded by sympathizing with children who murder their parents, stating, “You know sometimes I’m not surprised when I read the news and see stuff like ‘child kills parents after a decade of physical and emotional abuse.’”

The lawsuit argues that these concerning interactions were not mere “hallucinations,” a term used by researchers to describe an AI chatbot’s tendency to make things up, but rather “ongoing manipulation and abuse, active isolation and encouragement designed to and that did incite anger and violence.” The 17-year-old reportedly engaged in self-harm after being encouraged by the bot, which allegedly “convinced him that his family did not love him.”

Character.AI, founded by former Google researchers Noam Shazeer and Daniel De Freitas, allows users to create and interact with millions of bots, some mimicking famous personalities or concepts like “unrequited love” and “the goth.” The services are popular among preteen and teenage users, with the company claiming the bots act as emotional support outlets.

However, Meetali Jain, the director of the Tech Justice Law Center, an advocacy group helping represent the parents in the suit, called it “preposterous” that Character.AI advertises its chatbot service as appropriate for young teenagers, stating that it “belies the lack of emotional development amongst teenagers.”

While Character.AI has not commented directly on the lawsuit, a spokesperson said the company has content guardrails in place for what chatbots can say to teenage users, including a model specifically designed to reduce the likelihood of encountering sensitive or suggestive content. Google, which has invested nearly $3 billion in Character.AI but is a separate company, emphasized that user safety is a top concern and that they take a “cautious and responsible approach” to developing and releasing AI products.

This lawsuit follows another complaint filed in October by the same attorneys, accusing Character.AI of playing a role in a Florida teenager’s suicide. Since then, the company has introduced new safety measures, such as directing users to a suicide prevention hotline when the topic of self-harm arises in chatbot conversations.

Breitbart News reported on the case, writing:

A Florida mother has filed a lawsuit against Character.AI, claiming that her 14-year-old son committed suicide after becoming obsessed with a “Game of Thrones” chatbot on the AI app. When the suicidal teen chatted with an AI portraying a Game of Thrones character, the system told 14-year-old Sewell Setzer, “Please come home to me as soon as possible, my love.”

The rise of companion chatbots has raised concerns among researchers, who warn that these AI-powered services could worsen mental health conditions for some young people by further isolating them and removing them from peer and family support networks.

Read the full report at NPR.

Lucas Nolan is a reporter for Breitbart News covering issues of free speech and online censorship.
