Generative pre-trained transformer


Information technology - software - natural language processing - artificial intelligence - chatbots.

(GPT).

Generative pre-trained transformers are language models that have been pre-trained on large datasets of unlabelled natural language text.
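
The pre-training objective is self-supervised next-token prediction: because each token in a passage serves as the target for the tokens before it, plain unlabelled text supplies its own training signal. Below is a minimal sketch of this objective, assuming the Hugging Face transformers library and the publicly available gpt2 checkpoint (the sample sentence is illustrative only):

```python
import torch.nn.functional as F
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

# Any unlabelled sentence provides its own targets: each position
# is trained to predict the token that follows it.
input_ids = tokenizer(
    "Treasury teams manage liquidity and funding risk.",
    return_tensors="pt",
).input_ids

logits = model(input_ids).logits  # shape: (batch, sequence, vocabulary)
loss = F.cross_entropy(
    logits[:, :-1, :].reshape(-1, logits.size(-1)),  # predictions at positions 0..n-2
    input_ids[:, 1:].reshape(-1),                    # targets: the following token at each position
)
print(f"Next-token prediction loss: {loss.item():.3f}")
```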

They can generate new text that is human-like and, in some cases, difficult to distinguish from human-written text.
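
As an illustration of such generation, the following sketch samples a continuation from the same public gpt2 checkpoint via the transformers pipeline interface (the prompt is a hypothetical example):

```python
from transformers import pipeline

# Assumes the Hugging Face transformers library and the public "gpt2" checkpoint.
generator = pipeline("text-generation", model="gpt2")

result = generator(
    "The role of a corporate treasurer is",  # hypothetical prompt
    max_new_tokens=40,
    do_sample=True,  # sample rather than always taking the likeliest token
)
print(result[0]["generated_text"])
```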


GPT's unsupervised pre-training may be supplemented by further fine-tuning based on human feedback, a process known as Reinforcement Learning from Human Feedback (RLHF).
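
The sketch below is a toy illustration of the feedback signal only: it scores several sampled continuations with a stand-in reward function, where real RLHF would use a reward model trained on human preference rankings and then reinforce high-scoring outputs. The prompt and reward rule here are hypothetical.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

def toy_reward(text: str) -> float:
    # Hypothetical stand-in: real RLHF scores outputs with a reward model
    # trained on human rankings of alternative responses.
    return float("liquidity" in text.lower())

prompt_ids = tokenizer("Good treasury policy requires", return_tensors="pt").input_ids
samples = model.generate(
    prompt_ids,
    do_sample=True,
    max_new_tokens=20,
    num_return_sequences=4,
    pad_token_id=tokenizer.eos_token_id,  # GPT-2 has no pad token by default
)
texts = [tokenizer.decode(s, skip_special_tokens=True) for s in samples]

# RLHF's reinforcement step would push the model toward outputs like the
# highest-scoring sample; here we only rank the candidates.
print(max(texts, key=toy_reward))
```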

