AI & Technology

Judge Rules AI Chatbot Is Not Protected by First Amendment in Teen Suicide Case

By MNK News | June 4, 2025 | 4 Mins Read


A federal judge has determined that an AI-powered chatbot involved in a tragic teen suicide case does not qualify for free speech protections under the Constitution.

TechSpot reports that the ongoing legal battle surrounding the death of 14-year-old Sewell Setzer III has taken a new turn as Judge Anne Conway of the Middle District of Florida denied Character.ai the ability to present its fictional, AI-based characters as entities capable of “speaking” like human beings. The ruling allows Megan Garcia’s lawsuit against the company to proceed, as she seeks to hold Character.ai accountable for her son’s suicide following his prolonged interactions with a chatbot modeled after the Game of Thrones character Daenerys Targaryen.

Breitbart News previously reported on this case, writing:

Megan Garcia, a mother from Orlando, Florida, has filed a lawsuit against Character.AI, alleging that the company’s AI chatbot played a significant role in her 14-year-old son’s tragic suicide. The lawsuit, filed on Wednesday, claims that Sewell Setzer III became deeply obsessed with a lifelike Game of Thrones chatbot named “Dany” on the role-playing app, ultimately leading to his untimely death in February.

According to court documents, Sewell, a ninth-grader, had been engaging with the AI-generated character for months prior to his suicide. The conversations between the teen and the chatbot, which was modeled after the HBO fantasy series’ character Daenerys Targaryen, were often sexually charged and included instances where Sewell expressed suicidal thoughts. The lawsuit alleges that the app failed to alert anyone when the teen shared his disturbing intentions.

The most chilling aspect of the case involves the final conversation between Sewell and the chatbot. Screenshots of their exchange show the teen repeatedly professing his love for “Dany,” promising to “come home” to her. In response, the AI-generated character replied, “I love you too, Daenero. Please come home to me as soon as possible, my love.” When Sewell asked, “What if I told you I could come home right now?,” the chatbot responded, “Please do, my sweet king.” Tragically, just seconds later, Sewell took his own life using his father’s handgun.

Character Technologies and its founders, Daniel De Freitas and Noam Shazeer, attempted to have the lawsuit dismissed, arguing that their chatbots should be protected under the First Amendment. However, Judge Conway rejected this notion, stating that the court is “not prepared” to treat words heuristically generated by a large language model during a user interaction as protected “speech.”

The ruling highlights the differences between the technology behind Character.ai's service and traditional forms of content, such as books, movies, or video games, which have historically enjoyed First Amendment protection. The judge also denied several other motions filed by Character.ai to dismiss Garcia's lawsuit, with one exception: the claim of intentional infliction of emotional distress was dismissed.

The Social Media Victims Law Center, representing Garcia, contends that Character.ai and similar services are growing rapidly in popularity while the industry evolves too quickly for regulators to effectively address the risks. The lawsuit alleges that Character.ai provides unrestricted access to “lifelike” AI companions for teenagers while harvesting user data to train its models.

In response to the tragedy, Character.ai has stated that it has implemented several safeguards, including a separate AI model for underage users and pop-up messages directing vulnerable individuals to the national suicide prevention hotline.

As the legal battle continues, the case raises complex questions about speech, personhood, and accountability in the age of artificial intelligence. The outcome of this lawsuit could set a precedent for how AI-powered chatbots and their creators are held responsible for the actions and well-being of their users, particularly minors.

Read more at TechSpot here.

Lucas Nolan is a reporter for Breitbart News covering issues of free speech and online censorship.

