Hallucination

Definition

When an AI model generates plausible-sounding but factually incorrect or fabricated information.

Hallucinations are a fundamental challenge for generative AI: models are trained to produce fluent, plausible text, not to stay reliably grounded in truth.

Types

  • Factual errors: Wrong information stated with confidence
  • Entity confusion: Mixing up people, places, or dates
  • Citation fabrication: Inventing sources that do not exist (a simple screening check follows this list)
  • Logical inconsistency: Outputs that contradict themselves
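
Citation fabrication in particular can be screened mechanically when the set of allowed sources is known. Below is a minimal sketch; the function name, the bracketed citation format, and the source IDs are all illustrative rather than taken from any specific system.

```python
import re

def find_fabricated_citations(answer: str, known_sources: set[str]) -> list[str]:
    """Return citation keys in the answer that match no known source.

    Assumes citations appear as bracketed keys like [smith2021];
    the format is a hypothetical convention for this sketch.
    """
    cited = re.findall(r"\[([A-Za-z0-9_-]+)\]", answer)
    return [key for key in cited if key not in known_sources]

# Hypothetical example: the model cites [doe2019], which is not in our source set.
sources = {"smith2021", "lee2022"}
answer = "Attention-based models were refined in later work [smith2021][doe2019]."
print(find_fabricated_citations(answer, sources))  # ['doe2019']
```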

Mitigation

  • Retrieval-augmented generation (RAG) to ground answers in retrieved source documents (see the sketch after this list)
  • Confidence calibration, so the model expresses uncertainty instead of guessing
  • Automated fact-checking pipelines that verify claims against trusted sources
  • Citation requirements, so every claim can be traced back and verified
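
As a concrete illustration of the first item, here is a minimal sketch of RAG-style grounding in Python. The retrieve and llm_complete functions are hypothetical stubs (a real system would query a vector store and call a model API), and the prompt wording is one common pattern, not a prescribed one.

```python
def retrieve(query: str, k: int = 3) -> list[str]:
    """Stub retriever; a real system would query a vector store here."""
    return [
        "Doc A: The transformer architecture was introduced in 2017.",
        "Doc B: Retrieval grounds answers in verifiable sources.",
        "Doc C: Models should decline when context is insufficient.",
    ][:k]

def llm_complete(prompt: str) -> str:
    """Stub standing in for a real model API call."""
    return "I don't know."

def grounded_answer(query: str) -> str:
    # Build a prompt that restricts the model to retrieved context
    # and tells it to admit uncertainty rather than guess.
    context = "\n\n".join(retrieve(query))
    prompt = (
        "Answer using ONLY the context below. If the context does not "
        'contain the answer, say "I don\'t know."\n\n'
        f"Context:\n{context}\n\nQuestion: {query}\nAnswer:"
    )
    return llm_complete(prompt)

print(grounded_answer("When was the transformer introduced?"))
```

The key design choice is instructing the model to answer only from retrieved context and to say "I don't know" otherwise, trading some coverage for a lower hallucination rate.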