GPT
Lecturer: Ngoc Ba (VietAI Teaching Team, Founder @ ProtonX)
History of GPT

2018: GPT-1 ("Improving Language Understanding by Generative Pre-Training")
Discovery that the Transformer Decoder alone, without an Encoder, can perform multiple natural language tasks.

2019: GPT-2 ("Language Models are Unsupervised Multitask Learners")
Experimentation with large datasets and data preparation techniques for training a single model on multiple different tasks.

2020: GPT-3 ("Language Models are Few-Shot Learners")
Continuing the breakthroughs with zero-shot/one-shot/few-shot learning instead of fine-tuning the model.

4/2022: InstructGPT ("Training language models to follow instructions with human feedback")
Incorporating human-in-the-loop feedback and reinforcement learning into model training to avoid generating harmful or false information.

11/2022: ChatGPT