Generative pre-trained transformer
(GPT).
Generative pre-trained transformers are language models that have been pre-trained on large datasets of unlabelled natural language text.
They can generate new, human-like text that in some cases may be difficult to distinguish from human-written text.
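As a minimal sketch of this behaviour, the Python example below generates a continuation of a prompt using a pre-trained GPT-style model. It assumes the open-source Hugging Face transformers library is installed, and uses the publicly released gpt2 checkpoint; both are illustrative choices, not part of this entry.

from transformers import pipeline

# Load a small, publicly available pre-trained GPT-style model.
generator = pipeline("text-generation", model="gpt2")

# The model continues the prompt with newly generated, human-like text.
result = generator("Treasury management is", max_new_tokens=30)
print(result[0]["generated_text"])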
This pre-training may then be supplemented by additional human-supervised fine-tuning, such as Reinforcement Learning from Human Feedback (RLHF).