Hallucination

Treasury - automation - information technology - artificial intelligence - large language models.

In the context of artificial intelligence and large language models, a hallucination is an output that is inaccurate or fabricated, often presented with the same fluency and apparent confidence as a correct answer.

Some hallucinations may be relatively easy for subject matter experts to identify.

Others may pass through review processes without being detected.

The existence of hallucinations is one reason, among other good reasons, for retaining a human in the loop.
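As an illustration, the following Python sketch shows one way a human-in-the-loop control might work: a model output is withheld until a reviewer approves it. The names generate(), human_review() and answer_with_oversight() are hypothetical, chosen for illustration; they do not refer to any particular library's API.

# A minimal human-in-the-loop sketch. generate() is a hypothetical
# stand-in for a large language model call, not a real API.

def generate(prompt: str) -> str:
    """Hypothetical placeholder for a large language model call."""
    return f"Draft answer to: {prompt}"

def human_review(draft: str) -> bool:
    """Ask a subject matter expert to approve or reject the draft."""
    decision = input(f"Approve this output? (y/n)\n{draft}\n> ")
    return decision.strip().lower() == "y"

def answer_with_oversight(prompt: str) -> str | None:
    """Release a model output only after human approval."""
    draft = generate(prompt)
    if human_review(draft):
        return draft  # released only after expert approval
    return None       # rejected drafts are withheld

if __name__ == "__main__":
    result = answer_with_oversight("Summarise today's FX exposure.")
    print(result if result is not None else "Output withheld for correction.")

The point of the sketch is the control structure, not the implementation: nothing reaches the user unless a person has reviewed it, so a hallucination that slips past the model can still be caught by the reviewer.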

