> "Hallucination" is only a problem if later layers (or additional networks) can't detect and remove it
Yeah I mean sure. Anything is only a problem if it goes undetected.
The issue is that if you rely on a statistical model, you’ll always have hallucinations, so you can’t filter statistical output with another statistical model if you need real guarantees.
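For concreteness, a back-of-envelope sketch with made-up numbers (both rates below are assumptions, not measurements): stacking a statistical filter on a statistical generator shrinks the error rate but never drives it to zero, which is the point about “real guarantees.”

```python
# Illustrative only: assumed rates, not real measurements.
p_hallucination = 0.05   # generator hallucinates 5% of the time (assumed)
p_filter_miss = 0.10     # statistical filter misses a hallucination 10% of the time (assumed)

# Probability a hallucination slips through both layers.
p_residual = p_hallucination * p_filter_miss
print(f"Residual hallucination rate: {p_residual:.3%}")  # 0.500% -- smaller, but never zero
```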
Many products don’t need those guarantees though.