Instruction Pre-Training: Language Models are Supervised Multitask Learners
- https://arxiv.org/abs/2406.14491
- Daixuan Cheng, Yuxian Gu, Shaohan Huang, Junyu Bi, Minlie Huang, Furu Wei
Instruction tuning, How to pretrain transformer models, LLMs