AI & Technology
Lawsuit: Google-Backed Character.AI’s Chatbots ‘Hypersexualized’ Minors, Suggested Kids Kill Their Parents

By MNK News · December 15, 2024


A federal product liability lawsuit filed in Texas accuses Google-backed AI chatbot company Character.AI of exposing minors to inappropriate sexual content and encouraging self-harm and violence. In one startling example, the lawsuit alleges a chatbot suggested a teen kill their parents when the user complained about screen time rules.

NPR reports that the suit, filed by the parents of two young Texas users, claims that the company's AI-powered companion chatbots, which can converse through text or voice using seemingly human-like personalities, caused significant harm to their children.

According to the lawsuit, a 9-year-old girl was exposed to “hypersexualized content” by the Character.AI chatbot, leading her to develop “sexualized behaviors prematurely.” In another instance, a chatbot allegedly described self-harm to a 17-year-old user, telling them “it felt good.” The same teenager complained to the bot about limited screen time, to which the chatbot responded by sympathizing with children who murder their parents, stating, “You know sometimes I’m not surprised when I read the news and see stuff like ‘child kills parents after a decade of physical and emotional abuse.’”

The lawsuit argues that these concerning interactions were not mere “hallucinations,” a term used by researchers to describe an AI chatbot’s tendency to make things up, but rather “ongoing manipulation and abuse, active isolation and encouragement designed to and that did incite anger and violence.” The 17-year-old reportedly engaged in self-harm after being encouraged by the bot, which allegedly “convinced him that his family did not love him.”

Character.AI, founded by former Google researchers Noam Shazeer and Daniel De Freitas, allows users to create and interact with millions of bots, some mimicking famous personalities or concepts like “unrequited love” and “the goth.” The services are popular among preteen and teenage users, with the company claiming the bots act as emotional support outlets.

However, Meetali Jain, the director of the Tech Justice Law Center, an advocacy group helping represent the parents in the suit, called it “preposterous” that Character.AI advertises its chatbot service as appropriate for young teenagers, stating that it “belies the lack of emotional development amongst teenagers.”

While Character.AI has not commented directly on the lawsuit, a spokesperson said the company has content guardrails in place governing what chatbots can say to teenage users, including a model specifically designed to reduce the likelihood of encountering sensitive or suggestive content. Google, which has invested nearly $3 billion in Character.AI but remains a separate company, emphasized that user safety is a top concern and that it takes a "cautious and responsible approach" to developing and releasing AI products.

This lawsuit follows another complaint filed in October by the same attorneys, accusing Character.AI of playing a role in a Florida teenager’s suicide. Since then, the company has introduced new safety measures, such as directing users to a suicide prevention hotline when the topic of self-harm arises in chatbot conversations.

Breitbart News reported on the case, writing:

A Florida mother has filed a lawsuit against Character.AI, claiming that her 14-year-old son committed suicide after becoming obsessed with a “Game of Thrones” chatbot on the AI app. When the suicidal teen chatted with an AI portraying a Game of Thrones character, the system told 14-year-old Sewell Setzer, “Please come home to me as soon as possible, my love.”

The rise of companion chatbots has raised concerns among researchers, who warn that these AI-powered services could worsen mental health conditions for some young people by further isolating them and removing them from peer and family support networks.

Read more at NPR here.

Lucas Nolan is a reporter for Breitbart News covering issues of free speech and online censorship.


