Hallucination

From ACT Wiki

Treasury - automation - information technology - artificial intelligence - large language models.

In the context of artificial intelligence and large language models, a hallucination is an output that is inaccurate or fabricated, but presented as though it were correct.

Some hallucinations may be relatively easy for subject matter experts to identify.

Other hallucinations may pass through systems without being identified.


The existence of hallucinations is one reason, among other good reasons, for retaining a human in the loop to review and validate outputs before they are relied on.


See also