News Review - OpenAI’s Transcription Tool Hallucinates. Hospitals Are Using It Anyway
- James Zheng
- Nov 4, 2024
- 1 min read
Updated: Mar 4, 2025
While OpenAI warns that Whisper can introduce "hallucinations" (irrelevant or fabricated content) into its transcriptions, some companies still deploy the tool in high-risk domains such as healthcare, driven largely by the need to cut costs.
This news resonates with me because back in 2023, while working at AstraZeneca China, I interviewed several executives to find out whether and how generative AI was being integrated into existing business models. We were told that although this cool technology had been woven into several workflows, such as R&D and diagnosis, it was always monitored by health professionals and served only as an aid to decision-making: we should never put patients at risk of harm from accidental AI errors.
In light of this, I condemn some companies' irresponsible adoption of generative AI in high-risk fields, particularly those where lives are at stake. Not only are they hurting their brands, alienating their customers, and inviting long-term revenue loss, but they are also failing to empathize with the vulnerable. Empathy remains the gem of humanity, especially in a world where AI is outsmarting most human beings.
(Image Source: Wired.com)


