AI transcription tools 'hallucinate,' too (science.org)
5 points by 1970-01-01 11 days ago | 2 comments

While I haven't seen any transcription hallucinations as concerning as those mentioned in the article, I only became aware of this phenomenon tonight, when my 7-year-old daughter was asking ChatGPT's voice chat questions about birds.

===

Daughter: Did cats prey on the Dodo birds' eggs?

ChatGPT: Yes, cats, along with other introduced animals like rats and pigs, did prey on dodo eggs. [continues]

Daughter: Thank you for watching and please don't forget to like, share, and subscribe!

ChatGPT: You're welcome! It sounds like you're wrapping up a video. I hope it goes well! If you have any more questions or need further information in the future, feel free to ask. Happy sharing!

===

That second utterance is nowhere in the ballpark of what she actually asked, which was something about birds evolving into dinosaurs.

I found it super interesting that the transcription seemed to hallucinate "on rails": it produced a perfectly fluent sentence that had nothing to do with what was said, rather than, when it couldn't make sense of the utterance, a garbled sentence that attempted to match the sounds verbatim.
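One plausible explanation (an assumption on my part, not something the article states) is that speech decoders score transcripts with a language-model prior on top of the acoustic evidence, so when the audio is ambiguous, the prior pulls the output toward fluent stock phrases from the training data. A toy noisy-channel sketch, with made-up scores and a hypothetical `decode` helper:

```python
import math

# Toy noisy-channel decoding sketch (hypothetical scores, not a real ASR system).
# A decoder picks the transcript t maximizing
#     log P(audio | t) + lm_weight * log P(t)
# When the acoustic evidence is weak, the language-model prior dominates,
# so a fluent stock phrase can beat a faithful-but-garbled rendering.

def decode(candidates, lm_weight):
    """Return the candidate with the best combined acoustic + LM score."""
    return max(candidates, key=lambda c: c["acoustic"] + lm_weight * c["lm"])

candidates = [
    # Faithful but disfluent attempt: decent acoustic match,
    # but extremely unlikely under the language model.
    {"text": "burds evolfing into dinosors",
     "acoustic": math.log(0.30), "lm": math.log(1e-9)},
    # Common training-data phrase: poor acoustic match, huge LM prior.
    {"text": "please like, share, and subscribe",
     "acoustic": math.log(0.05), "lm": math.log(1e-3)},
]

# With no LM weight the faithful attempt wins; with a typical LM weight
# the fluent stock phrase does -- hallucinating "on rails".
print(decode(candidates, lm_weight=0.0)["text"])
print(decode(candidates, lm_weight=1.0)["text"])
```

With `lm_weight=0.0` the acoustically closer garbled string wins; with `lm_weight=1.0` the fluent phrase does, which matches the behavior above: the model commits to a well-formed sentence rather than transcribing the sounds it actually heard.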


I appreciate the scare quotes around 'hallucinate.' These robots are just making shit up.


