Generative pre-trained transformer


Information technology - software - natural language processing - artificial intelligence - chatbots.

(GPT).

Generative pre-trained transformers are language models that have been pre-trained on large datasets of unlabelled natural language text.

They can generate new text that is human-like and that, in some cases, may be difficult to distinguish from human-written text.
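
As an illustration, the following minimal Python sketch generates text from a pre-trained GPT model. It assumes the open-source Hugging Face transformers library and its publicly available gpt2 checkpoint, neither of which is specified by this page.

 from transformers import pipeline
 
 # Load a small, publicly available GPT checkpoint for demonstration.
 generator = pipeline("text-generation", model="gpt2")
 
 # Generate a continuation of a prompt; sampling makes each run different.
 result = generator("Treasury management is", max_new_tokens=30, do_sample=True)
 print(result[0]["generated_text"])

Because generation samples from a probability distribution over possible next tokens, repeated runs with the same prompt typically produce different continuations.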


A GPT's unsupervised pre-training is often supplemented by a further, human-supervised fine-tuning stage, known as Reinforcement Learning from Human Feedback (RLHF).
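
The sketch below shows the shape of a single RLHF step in Python. Both functions are hypothetical stand-ins introduced only for illustration: a real reward model is a neural network trained on human preference comparisons, and the policy update itself (typically done with an algorithm such as PPO) is omitted.

 import random
 
 # Hypothetical stand-in for a reward model; in practice this is a neural
 # network trained on human preference rankings of model outputs.
 def reward_model(prompt, response):
     return -len(response)  # placeholder scoring rule: prefers shorter replies
 
 # Hypothetical stand-in for the policy, i.e. the pre-trained GPT being tuned.
 def generate_response(prompt):
     return random.choice(["a short reply", "a considerably longer reply"])
 
 # One RLHF step: sample a response, score it with the reward model, and
 # (in a real system) update the policy to increase the expected reward.
 prompt = "Explain GPT briefly."
 response = generate_response(prompt)
 score = reward_model(prompt, response)
 print(response, score)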


See also


Other resource