Hallucination (AI)

Definition

In the field of artificial intelligence, hallucination refers to the phenomenon where a large language model (LLM) generates content that sounds linguistically correct and convincing but is factually inaccurate, unverifiable, or entirely fabricated. The cause lies in the statistical nature of text generation: the model predicts plausible word sequences, which guarantees neither truthfulness nor reliable sourcing.

Examples

  • An AI invents studies or sources that do not exist
  • False facts about companies, products, or people
  • Generation of data or figures without any real basis
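Outputs matching the patterns above can be routed to human review before publication. The following is a minimal sketch of such a triage step; the regular expressions and the `flag_claims` helper are illustrative assumptions, not a complete hallucination detector, and real checks would also verify flagged citations against a bibliographic database.

```python
import re

# Citation-like spans such as "(Miller et al., 2021)" -- candidates for
# invented sources that should be verified against a real database.
CITATION_PATTERN = re.compile(r"\((?:[A-Z][a-z]+(?: et al\.)?,? \d{4})\)")

# Unsourced figures such as "40%" or "3 million" -- candidates for
# fabricated data that should be traced to an actual source.
NUMBER_PATTERN = re.compile(r"\b\d+(?:\.\d+)?\s*(?:%|percent|million|billion)")

def flag_claims(text: str) -> list[str]:
    """Return citation-like and numeric spans that warrant human review."""
    flags = []
    flags.extend(CITATION_PATTERN.findall(text))
    flags.extend(NUMBER_PATTERN.findall(text))
    return flags

sample = "Revenue grew 40% last year (Miller et al., 2021)."
print(flag_claims(sample))  # flags both the figure and the citation
```

A sketch like this does not decide whether a claim is true; it only surfaces the spans most likely to be hallucinated so a human can check them.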

Benefits of awareness of hallucinations

  • Critical review and quality assurance of AI outputs
  • Training employees to handle AI-generated text
  • Avoiding reputational damage caused by false information