
An AI-powered transcription tool used in hospitals invents things no one has ever said before, researchers say

In one example they uncovered, a speaker said: “That kid, I’m not exactly sure, was going to take the umbrella.”

But the transcription software added: “He took a big piece of a cross, a teeny tiny piece… I’m sure he didn’t have a terrorist knife, so he killed a lot of people.”

A speaker in another recording described "two girls and one more lady." Whisper invented extra commentary about race, adding that "two more girls and a lady were, um, Black."

In a third transcript, Whisper invented a non-existent drug called “hyperactive antibiotics.”

Researchers don’t know why Whisper and similar tools hallucinate, but software developers say the hallucinations often occur during pauses, amid background noise, or while music is playing.

In its online disclosures, OpenAI has recommended against using Whisper in "decision-making contexts, where flaws in accuracy can lead to pronounced flaws in outcomes."

That warning hasn’t stopped hospitals or medical centers from using speech-to-text models, including Whisper, to transcribe what is said during doctor visits so medical providers can spend less time taking notes or writing reports.

More than 30,000 clinicians and 40 health systems, including the Mankato Clinic in Minnesota and Children’s Hospital Los Angeles, have begun using a Whisper-based tool built by Nabla, which has offices in France and the U.S.

Martin Raison, Nabla’s chief technology officer, said the tool is fine-tuned on medical language to transcribe and summarize patients’ interactions.

Company officials said they were aware Whisper could hallucinate and were mitigating the problem.

Raison said it was impossible to compare Nabla’s AI-generated transcription to the original recording because Nabla’s tool deleted the original audio “for data security reasons.”

The tool has been used to transcribe an estimated 7 million medical visits, Nabla said.

William Saunders, a former OpenAI engineer, said deleting the original audio could be concerning if transcripts aren’t double-checked or if clinicians can’t access the recording to verify its accuracy.

“If you take away the ground truth, you can’t catch errors,” he said.

No model is perfect, Nabla said, and its tool currently requires medical providers to quickly review and approve transcribed notes, though that could change.

Because patients’ conversations with their doctors are confidential, it is difficult to know how AI-generated transcripts affect them.

But Rebecca Bauer-Kahan, a California state lawmaker, said she took one of her children to the doctor earlier this year and refused to sign a health network form seeking her permission to share the consultation audio with vendors including Microsoft Azure, the cloud computing system run by OpenAI’s largest investor. Bauer-Kahan said she did not want such intimate medical conversations shared with technology companies.

“The release was very specific that for-profit companies would have the right to have this,” said Bauer-Kahan, a Democrat who represents part of the San Francisco suburbs in the State Assembly. “I said, ‘Absolutely not.’”

John Muir Health spokesman Ben Drew said the health system complies with state and federal privacy laws.

___

Schellmann reported from New York.

___

This story was produced in partnership with the Pulitzer Center’s AI Accountability Network, which also partially supported the academic Whisper study.

___

The Associated Press receives financial assistance from the Omidyar Network to support coverage of artificial intelligence and its impact on society. AP is solely responsible for all content. Find AP’s standards for working with philanthropies, a list of supporters, and funded coverage areas at AP.org.

___

The Associated Press and OpenAI have a licensing and technology agreement that allows OpenAI access to some of AP’s text archives.