What is a Hallucination?
In machine learning, a hallucination is an instance in which a model generates output that is factually incorrect or misleading despite appearing plausible. Mitigating hallucinations is a key aspect of responsible AI development.