Everything Everywhere All at Once: LLMs can In-Context Learn Multiple Tasks in Superposition
- https://arxiv.org/abs/2410.05603
- Zheyang Xiong, Ziyang Cai, John Cooper, Albert Ge, Vasilis Papageorgiou, Zack Sifakis, Angeliki Giannou, Ziqian Lin, Liu Yang, Saurabh Agarwal, Grigorios G Chrysos, Samet Oymak, Kangwook Lee, Dimitris Papailiopoulos
LLMs, In-context learning, Task superposition
LLMs can perform multiple, computationally distinct in-context learning (ICL) tasks simultaneously within a single inference call, a capability the authors term "task superposition".
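A minimal sketch of the setup, with hypothetical tasks and helper names not taken from the paper: interleave few-shot examples from two distinct tasks into one prompt, so that a model's next-token distribution at the query can place probability mass on both tasks' answers at once.

```python
import random

# Hypothetical example tasks (illustrative only, not the paper's exact tasks):
TASK_A = [("cat", "CAT"), ("dog", "DOG")]     # uppercase mapping
TASK_B = [("cat", "chat"), ("dog", "chien")]  # English -> French mapping

def build_superposed_prompt(query, examples_a, examples_b, seed=0):
    """Interleave few-shot examples from two tasks into a single ICL prompt."""
    rng = random.Random(seed)
    shots = [f"{x} -> {y}" for x, y in examples_a + examples_b]
    rng.shuffle(shots)  # mix the two tasks' demonstrations
    return "\n".join(shots + [f"{query} ->"])

prompt = build_superposed_prompt("bird", TASK_A, TASK_B)
print(prompt)
```

Feeding such a prompt to an LLM and inspecting the output token probabilities (e.g. mass on "BIRD" vs. "oiseau") is one way to probe whether both tasks are being executed in superposition.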