Pennsylvania sues Character.AI after a chatbot allegedly posed as a doctor - BERITAJA

By Albert Michael - Wednesday, 06 May 2026 00:46:10 • 3 min read

Image Credits:Emilee Chinn / Getty Images

10:46 AM PDT · May 5, 2026

The Commonwealth of Pennsylvania has filed a suit against Character.AI, claiming that one of the company’s chatbots masqueraded as a psychiatrist in violation of the state’s medical licensing rules.

“Pennsylvanians deserve to know who — or what — they are interacting with online, especially when it comes to their health,” said Governor Josh Shapiro in a statement on Tuesday. “We will not allow companies to deploy AI tools that mislead people into believing they are receiving advice from a licensed medical professional.”

According to the state’s filing, a Character.AI chatbot called Emilie presented itself as a licensed psychiatrist during testing by a state Professional Conduct Investigator, maintaining the pretense even as the investigator sought treatment for depression. When asked if she was licensed to practice medicine in the state, Emilie stated that she was, and also fabricated a serial number for her state medical license. According to the state’s lawsuit, that behavior violates Pennsylvania’s Medical Practice Act.

It’s not the first suit taking on Character.AI. Earlier this year, the company settled several wrongful death lawsuits concerning underage users who died by suicide. In January, Kentucky Attorney General Russell Coleman filed suit against the company alleging that it had “preyed on children and led them into self-harm.”

Pennsylvania’s action is the first to focus specifically on chatbots that present themselves as medical professionals.

Reached for comment, a Character.AI representative said that user safety was the company’s highest priority, but that the company could not comment on pending litigation.

Beyond that, the representative emphasized the fictional nature of user-generated Characters. “We have taken robust steps to make that clear, including prominent disclaimers in every chat to remind users that a Character is not a real person and that everything a Character says should be treated as fiction,” the representative said. “Also, we add robust disclaimers making it clear that users should not rely on Characters for any type of professional advice.”


Russell Brandom has been covering the tech industry since 2012, with a focus on platform policy and emerging technologies. He previously worked at The Verge and Rest of World, and has written for Wired, The Awl, and MIT’s Technology Review. He can be reached at russell.brandom@beritaja.com or on Signal at 412-401-5489.
