AI Chatbots Now Authorize Psychiatric Medication

Utah will allow an AI system to prescribe psychiatric drugs without a doctor, and some psychiatrists are questioning exactly what problem the move is meant to solve. The pilot program is only the second instance of clinical authority being delegated to AI in the country. State officials believe it could cut costs and ease care shortages, but physicians, doubtful it will meaningfully expand access to mental health care, point to the system's opacity and risks.

The one-year trial, run by San Francisco startup Legion Health, will let an AI chatbot renew certain psychiatric prescriptions for a $19-a-month subscription. Its scope is deliberately narrow: it covers only 15 low-risk maintenance medications for patients who meet defined stability criteria, and it cannot issue new prescriptions or handle drugs that require close clinical oversight. Patients must opt in, verify their identity, and provide proof of a current prescription. The chatbot then asks about symptoms, medication side effects, and other risk factors, and can escalate a case to a clinician if necessary.

State officials argue that automating routine refills could free clinicians to focus on high-risk patients and help address shortages, potentially reaching the estimated 500,000 Utah residents who lack mental health care. Legion CEO Yash Patel envisions a broader impact, but psychiatrists such as Brent Kious remain skeptical of the system's benefits, fearing over-treatment and questioning whether even routine psychiatric care can safely be delegated to AI. Kious warns that the system's limitations could cause it to miss nuances in a patient's presentation, and stresses the importance of human oversight.

Safety is a central concern: AI prescribing is in its infancy, and the systems involved remain opaque. One risk is that the chatbot misses crucial information during screening. Previous AI experiments in Utah have run into problems, including spreading vaccine conspiracy theories and generating harmful instructions. Legion's pilot operates under strict controls, with human review required for initial requests. Co-founder Arthur MacWaters acknowledges that remote care models carry risks of their own, and says the pilot's safeguards are designed to expand mental health access in the state. Future expansions have been floated, but specific plans remain undisclosed. Psychiatrists also question whether the program is needed at all, noting that established patients can usually obtain refills easily from their own clinicians.