
The FDA’s drug-approving chatbot makes false claims, insiders say

by Adrian Russell


The Food and Drug Administration’s new AI tool — touted by Secretary of Health and Human Services Robert F. Kennedy, Jr. as a revolutionary solution for shortening drug approvals — has so far produced more hallucinations than solutions.

Known as Elsa, the AI chatbot was introduced to help FDA employees with daily tasks like meeting notes and emails, while simultaneously supporting quicker drug and device approval turnaround times by sorting through important application data. But, according to FDA insiders who spoke to CNN on condition of anonymity, the chatbot is rife with hallucinations, often fabricating medical studies or misinterpreting important data. The tool has been sidelined by staffers, with sources saying it can’t be used in reviews and does not have access to crucial internal documents employees were promised.

“It hallucinates confidently,” one FDA employee told CNN. According to the sources, the tool often provides incorrect answers about the FDA’s research areas and drug labels, and cannot link to citations from external medical journals.

Despite initial claims that the tool was already integrated into the clinical review protocol, FDA Commissioner Marty Makary told CNN that the tool was only being used for “organizational duties” and was not required of employees. The FDA’s head of AI admitted to the publication that the tool could hallucinate, carrying the same risks as other LLMs. Both said they weren’t surprised it made mistakes and that further testing and training were needed.


But not all LLMs have the job of approving life-saving medicine.

The agency announced the new agentic tool in June, with Vinay Prasad, director of the FDA’s Center for Biologics Evaluation and Research (CBER), and Makary writing that AI innovation was a leading priority for the agency in an accompanying Journal of the American Medical Association (JAMA) article. The tool, which examines device and drug applications, was pitched as a solution for lengthy and oft-criticized drug approval periods, following the FDA’s launch of an AI-assisted scientific review pilot.

The Trump administration has rallied government agencies behind an accelerated, “America-first” AI agenda, including recent federal guidance to establish FDA-backed AI Centers of Excellence for testing and deploying new AI tools, announced in the government’s newly unveiled AI Action Plan. Many are worried that the aggressive push and deregulation efforts eschew necessary oversight of the new tech.

“Many of America’s most critical sectors, such as healthcare, are especially slow to adopt due to a variety of factors, including distrust or lack of understanding of the technology, a complex regulatory landscape, and a lack of clear governance and risk mitigation standards,” the action plan reads. “A coordinated Federal effort would be beneficial in establishing a dynamic, ‘try-first’ culture for AI across American industry.”


