Artificial Intelligence

Hallucination (AI)

Also known as: Hallucination in AI and Model hallucination

Definition

Hallucination in AI occurs when a model generates incorrect or fabricated information that appears plausible.

In practice

Common in generative models, especially when they are asked about unknown or ambiguous topics.

The reality

Hallucinations are a known limitation of current AI and cannot be fully eliminated.


Plain English

When AI makes things up.

FAQ

Common questions

A few practical answers to the questions that usually come up around this term.

What is an AI hallucination?

It is when AI generates false or made-up information.

Why do AI hallucinations happen?

Because models predict likely outputs rather than verifying facts.

Are hallucinations dangerous?

They can be if outputs are trusted without verification.

How do you reduce hallucinations?

By using better training data, output validation, and external knowledge sources.
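One of these mitigations, output validation, can be illustrated with a minimal sketch: check each claim a model produces against a trusted reference before accepting it. The facts, claims, and function names below are illustrative assumptions, not part of any real system.

```python
# Illustrative only: a toy reference set standing in for an external
# knowledge source (in practice this might be a database or search index).
TRUSTED_FACTS = {
    "The Eiffel Tower is in Paris.",
    "Water boils at 100 degrees Celsius at sea level.",
}

def validate(claims):
    """Flag each claim as supported (True) or unsupported (False)
    by the trusted reference set."""
    return [(claim, claim in TRUSTED_FACTS) for claim in claims]

model_output = [
    "The Eiffel Tower is in Paris.",
    "The Eiffel Tower is 500 metres tall.",  # fabricated detail
]

for claim, supported in validate(model_output):
    print("OK  " if supported else "FLAG", claim)
```

Real systems replace the exact-match lookup with retrieval against external sources, but the principle is the same: unsupported claims are flagged rather than trusted.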
