
OpenAI’s Whisper transcription tool has hallucination issues, researchers say

Software engineers, developers, and academic researchers have serious concerns about transcriptions from OpenAI’s Whisper, according to a report in the Associated Press.

While there’s been no shortage of discussion around generative AI’s tendency to hallucinate (basically, to make things up), it’s a bit surprising that this is an issue in transcription, where you’d expect the transcript to closely follow the audio being transcribed.

Instead, researchers told the AP that Whisper has introduced everything from racial commentary to imagined medical treatments into transcripts. That could be particularly disastrous as Whisper is adopted in hospitals and other medical contexts.

A University of Michigan researcher studying public meetings found hallucinations in eight out of every 10 audio transcriptions. A machine learning engineer studied more than 100 hours of Whisper transcriptions and found hallucinations in more than half of them. And a developer reported finding hallucinations in nearly all of the 26,000 transcriptions he created with Whisper.

An OpenAI spokesperson said the company is “continually working to improve the accuracy of our models, including reducing hallucinations” and noted that its usage policies prohibit using Whisper “in certain high-stakes decision-making contexts.”

“We thank researchers for sharing their findings,” they said.
