
> "Hallucination" is only a problem if later layers (or additional networks) can't detect and remove it

Yeah, I mean, sure. Anything is only a problem if it goes undetected. The issue is that if you rely on a statistical model, you'll always have some rate of hallucinations, so you can't filter statistical output with another statistical model if you need real guarantees.
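To make that concrete, here's a toy calculation. The rates are made-up assumptions, and it assumes the generator's and filter's errors are independent (in practice they're likely correlated, which makes things worse, not better):

```python
# Toy model: a generator that hallucinates, composed with a
# statistical filter that sometimes misses hallucinations.
# Both rates below are illustrative assumptions, not measurements.

p_hallucinate = 0.05   # generator emits a hallucination 5% of the time
p_filter_miss = 0.10   # filter fails to flag a hallucination 10% of the time

# Assuming independence, the residual rate is the product:
residual = p_hallucinate * p_filter_miss
print(f"residual hallucination rate: {residual:.3f}")
```

Stacking filters shrinks the residual rate multiplicatively, but it never reaches zero, which is the point: you get "better odds", not a guarantee.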

Many products don’t need those guarantees though.



