Hallucination
In the context of artificial intelligence and large language models, a hallucination is an output that is false, fabricated or otherwise inaccurate, often presented in a fluent and plausible way.
Some hallucinations are relatively easy for subject matter experts to identify. Others may pass through systems and review processes without being detected.
The risk of hallucinations is one reason, among others, for retaining a human in the loop.
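As an illustration only, the sketch below shows one way a human-in-the-loop check might be wrapped around a model's draft output before it is released. The generate_draft and human_approves functions are hypothetical placeholders standing in for a model call and an expert review step; they are not part of any particular system.

```python
# A minimal sketch of a human-in-the-loop gate around automated output.
# generate_draft and human_approves are hypothetical placeholders.

from typing import Optional


def generate_draft(prompt: str) -> str:
    """Stand-in for a call to a large language model."""
    return f"Draft answer to: {prompt}"


def human_approves(draft: str) -> bool:
    """Stand-in for review by a subject matter expert.

    In a real workflow this would route the draft to a reviewer
    and wait for an explicit accept or reject decision.
    """
    return input(f"Approve this draft? (y/n)\n{draft}\n> ").strip().lower() == "y"


def answer_with_review(prompt: str) -> Optional[str]:
    """Release output only after a human reviewer has accepted it."""
    draft = generate_draft(prompt)
    if human_approves(draft):
        return draft
    # Rejected drafts (for example, suspected hallucinations) are withheld.
    return None


if __name__ == "__main__":
    result = answer_with_review("Summarise the counterparty's credit terms.")
    print(result if result is not None else "Draft withheld pending review.")
```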