AI Chatbots As Mental-Health Therapists? No Way, NJ Lawmakers Say
By: Cora LeCates
From personal assistant to librarian and tutor to graphic designer, generative AI can play a host of roles. However, there is one part that New Jersey lawmakers seek to exclude from artificial intelligence systems: shrink.
On June 16, the Assembly Science, Innovation and Technology Committee unanimously released a bill prohibiting the advertisement of AI systems as licensed mental health professionals.
Apps and websites, including Abby.gg, Wysa and Replika, say users can tell their troubles to artificial intelligence programs to complement or even replace traditional talk therapy. The programs are typically advertised on social media, often with endorsements from influencers – a tactic that critics say is driving misconceptions about the technology’s capabilities.
“Abby uses advanced natural language processing to understand and respond to your emotions, offer tailored insights and guide you through difficult situations,” its website says. The system offers “therapy in your pocket,” it says, for troubles including trauma, depression, bullying, substance abuse and loneliness. “Experience round-the-clock support and guidance with a 24/7 AI therapist, always at your fingertips to help you navigate life’s challenges,” the site says.
In a 2024 interview, University of California, Berkeley School of Public Health Professor Jodi Halpern warned of the dangers of using AI in mental health, citing patient vulnerability and a lack of government regulation for the “treatment.”
“People with mental health needs are often reluctant to seek care, and making an actual human connection can help,” Halpern said.
AI increasingly is being pressed into service for healthcare purposes, including diagnostics, robotic surgery, predictive modeling and drug research. Ads pitching it for mental health come as a session with a licensed therapist can cost $100 or more – and that’s if appointments are open. According to a 2023 American Psychological Association survey, more than half of psychologists lack openings for new patients.
In a 2024 letter to the Federal Trade Commission, the association wrote of AI “therapy”: “These are not FDA-cleared digital health tools, are not subject to HIPAA compliance, or required to demonstrate any evidence base supporting their efficacy or safety.”
On June 11, a coalition led by the Consumer Federation of America filed a similar complaint with the Federal Trade Commission, as well as with the attorneys general and mental health licensing boards of all 50 states.
In May, Utah became the first state to enact a law creating advertising restrictions and privacy protections for mental health chatbots. Legislation similar to New Jersey’s is under consideration in several other states, including New York, Illinois and Nevada.
The New Jersey Assembly panel passed the legislation without comment. It will now advance to the full Assembly for consideration.