
Hallucination (AI)


AI hallucination occurs when an AI model generates information that sounds plausible but is incorrect or entirely fabricated. It is a well-known limitation of large language models (LLMs) such as ChatGPT, Claude, and Gemini.

Why is understanding hallucinations important?

For businesses that rely on AI for content creation or customer interaction, hallucinations can produce misinformation that damages credibility. Techniques such as retrieval-augmented generation (RAG), which grounds the model's answer in retrieved source documents, and automated fact-checking help reduce the risk.
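To make the RAG idea concrete, here is a minimal sketch of grounding: retrieve the most relevant documents for a query, then build a prompt that instructs the model to answer only from that context. The documents, the naive word-overlap scoring, and the prompt template are illustrative assumptions, not any specific product's API.

```python
def retrieve(query, documents, k=2):
    """Rank documents by naive word overlap with the query (illustrative only;
    real RAG systems use embedding-based similarity search)."""
    q_words = set(query.lower().split())
    scored = sorted(documents,
                    key=lambda d: len(q_words & set(d.lower().split())),
                    reverse=True)
    return scored[:k]

def build_grounded_prompt(query, documents):
    """Build a prompt that tells the model to answer only from the retrieved
    context, reducing the chance of hallucinated facts."""
    context = "\n".join(f"- {d}" for d in retrieve(query, documents))
    return (
        "Answer using ONLY the context below. "
        "If the answer is not in the context, say you don't know.\n\n"
        f"Context:\n{context}\n\nQuestion: {query}"
    )

# Hypothetical knowledge base for a customer-support assistant.
docs = [
    "Our refund policy allows returns within 30 days of purchase.",
    "Support hours are 9:00-17:00 CET on weekdays.",
    "The company was founded in 2015 in Oslo.",
]
prompt = build_grounded_prompt("What is the refund policy?", docs)
print(prompt)
```

The key anti-hallucination step is the instruction to refuse when the context lacks an answer: instead of inventing a plausible-sounding reply, the model is steered toward "I don't know," which downstream fact-checking can then handle.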