While I haven't seen any concerning transcription hallucinations like those mentioned in the article, I did just become aware of this phenomenon tonight when my 7yo daughter was asking ChatGPT voice chat questions about birds.
===
Daughter: Did cats prey on the Dodo birds' eggs?
ChatGPT: Yes, cats, along with other introduced animals like rats and pigs, did prey on dodo eggs. [continues]
Daughter: Thank you for watching and please don't forget to like, share, and subscribe!
ChatGPT: You're welcome! It sounds like you're wrapping up a video. I hope it goes well! If you have any more questions or need further information in the future, feel free to ask. Happy sharing!
===
That second question is nowhere in the ballpark of what she asked, which was something about birds evolving into dinosaurs.
I found it super interesting that the transcription seemed to... hallucinate on rails (?) into a perfectly fluent sentence that had nothing to do with what was said, rather than—if it couldn't make sense of the utterance—a garbled sentence that attempted to match the sounds verbatim.