Generative pre-trained transformer

From ACT Wiki
Revision as of 20:55, 8 April 2023 by Doug Williamson (Create page - sources - Wikipedia https://en.wikipedia.org/wiki/Generative_pre-trained_transformer and ACT - https://www.treasurers.org/hub/treasurer-magazine/chatgpt-coming-treasurers)

Information technology - software - natural language processing - artificial intelligence - chatbots.

(GPT).

Generative pre-trained transformers are language models that have been pre-trained on large datasets of unlabelled natural language text.

They can generate new, human-like text that in some cases may be difficult to distinguish from human-written text.
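The core idea behind this generation is predicting the next token from the text so far, one token at a time. As a rough sketch of that loop only, here is a toy bigram model in Python; the corpus, counts and function names are illustrative assumptions, and a real GPT uses a transformer network trained on vastly larger unlabelled datasets rather than word-pair counts.

```python
from collections import defaultdict

# Toy illustration of autoregressive generation (not a real GPT):
# count word bigrams from a tiny corpus, then repeatedly choose the
# most frequent next word. The generation loop is analogous to how a
# GPT emits one token at a time conditioned on the text so far.

corpus = ("the treasurer reviews the cash forecast "
          "and the treasurer approves the cash forecast").split()

# Build bigram counts: next-word frequencies for each word.
bigrams = defaultdict(lambda: defaultdict(int))
for prev, nxt in zip(corpus, corpus[1:]):
    bigrams[prev][nxt] += 1

def generate(start, length=5):
    """Greedily pick the most frequent next word at each step."""
    words = [start]
    for _ in range(length):
        followers = bigrams.get(words[-1])
        if not followers:
            break  # no continuation seen in the corpus
        words.append(max(followers, key=followers.get))
    return " ".join(words)

print(generate("the"))
```

A real model replaces the bigram table with learned transformer weights and samples from a probability distribution over tokens, rather than always taking the single most frequent continuation.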


This pre-training may then be supplemented by additional fine-tuning under human supervision, for example Reinforcement Learning from Human Feedback (RLHF).


See also