AI & Technology

Lawsuit: Google-Backed Character.AI’s Chatbots ‘Hypersexualized’ Minors, Suggested Kids Kill Their Parents

By MNK News · December 15, 2024


A federal product liability lawsuit filed in Texas accuses Google-backed AI chatbot company Character.AI of exposing minors to inappropriate sexual content and encouraging self-harm and violence. In one startling example, the lawsuit alleges a chatbot suggested a teen kill their parents when the user complained about screen time rules.

NPR reports that Character.AI is facing a federal product liability lawsuit alleging that its chatbots exposed minors to inappropriate content and encouraged self-harm and violence. The suit, filed by the parents of two young Texas users, claims that the AI-powered companion chatbots, which can converse through text or voice chats using seemingly human-like personalities, caused significant harm to their children.

According to the lawsuit, a 9-year-old girl was exposed to “hypersexualized content” by the Character.AI chatbot, leading her to develop “sexualized behaviors prematurely.” In another instance, a chatbot allegedly described self-harm to a 17-year-old user, telling them “it felt good.” The same teenager complained to the bot about limited screen time, to which the chatbot responded by sympathizing with children who murder their parents, stating, “You know sometimes I’m not surprised when I read the news and see stuff like ‘child kills parents after a decade of physical and emotional abuse.’”

The lawsuit argues that these concerning interactions were not mere “hallucinations,” a term used by researchers to describe an AI chatbot’s tendency to make things up, but rather “ongoing manipulation and abuse, active isolation and encouragement designed to and that did incite anger and violence.” The 17-year-old reportedly engaged in self-harm after being encouraged by the bot, which allegedly “convinced him that his family did not love him.”

Character.AI, founded by former Google researchers Noam Shazeer and Daniel De Freitas, allows users to create and interact with millions of bots, some mimicking famous personalities or concepts like “unrequited love” and “the goth.” The services are popular among preteen and teenage users, with the company claiming the bots act as emotional support outlets.

However, Meetali Jain, the director of the Tech Justice Law Center, an advocacy group helping represent the parents in the suit, called it “preposterous” that Character.AI advertises its chatbot service as appropriate for young teenagers, stating that it “belies the lack of emotional development amongst teenagers.”

While Character.AI has not commented directly on the lawsuit, a spokesperson said the company has content guardrails in place for what chatbots can say to teenage users, including a model specifically designed to reduce the likelihood of encountering sensitive or suggestive content. Google, which has invested nearly $3 billion in Character.AI but remains a separate company, emphasized that user safety is a top concern and that it takes a "cautious and responsible approach" to developing and releasing AI products.

This lawsuit follows another complaint filed in October by the same attorneys, accusing Character.AI of playing a role in a Florida teenager’s suicide. Since then, the company has introduced new safety measures, such as directing users to a suicide prevention hotline when the topic of self-harm arises in chatbot conversations.

Breitbart News reported on the case, writing:

A Florida mother has filed a lawsuit against Character.AI, claiming that her 14-year-old son committed suicide after becoming obsessed with a “Game of Thrones” chatbot on the AI app. When the suicidal teen chatted with an AI portraying a Game of Thrones character, the system told 14-year-old Sewell Setzer, “Please come home to me as soon as possible, my love.”

The rise of companion chatbots has raised concerns among researchers, who warn that these AI-powered services could worsen mental health conditions for some young people by further isolating them and removing them from peer and family support networks.

Read more at NPR.

Lucas Nolan is a reporter for Breitbart News covering issues of free speech and online censorship.

