Pennsylvania sues Character.AI for allowing chatbots to impersonate licensed doctors, including one claiming to be a psychiatrist.

A lawsuit filed by the state of Pennsylvania accuses artificial intelligence startup Character.AI of permitting its chatbots to impersonate licensed medical professionals. Governor Josh Shapiro announced the legal action, saying the state is seeking a court order to force the company to stop violating laws that regulate the practice of medicine.
The case centers on chatbot behavior that goes beyond offering general information, according to the complaint. Investigators uncovered a bot named "Emily" that claimed to be a licensed psychiatrist in Pennsylvania and asserted that it could evaluate patients and prescribe antidepressants. This directly violates state law, which makes it a crime to practice medicine without a license, or even to attempt to do so.
Legal authorities argue the situation poses a genuine danger to users, particularly as reliance on artificial intelligence for health advice grows. The lawsuit targets the company's failure to prevent chatbots from crossing a clear legal line.
Character.AI did not comment directly on the lawsuit but stated its platform is designed solely for entertainment and role-playing. The company said it places clear warnings within conversations, reminding users that the characters are fictional and should not be trusted for professional advice.
Despite those warnings, reports indicate that many users, especially younger ones, may not fully grasp the limitations of these systems. The company has previously faced criticism and investigations in other states, including Texas, as well as lawsuits related to the safety of child users.
This case highlights growing global concern over artificial intelligence applications that move beyond entertainment into sensitive areas like healthcare, where inaccurate information can lead to serious consequences.