Ideas to add on
- What are LLMs?
- What’s the history behind them?
- How are they evolving now?
Interesting terms
- Catastrophic interference/forgetting — the tendency for knowledge of previously learned task(s) to be abruptly lost as information relevant to the current task is incorporated¹
- Knowledge distillation — a technique that transfers the learnings of a large pre-trained model (“teacher model”) to a smaller one (“student model”)²
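A minimal sketch of the distillation idea above, using NumPy: the student is trained against the teacher's temperature-softened output distribution rather than hard labels. The logit values and temperature here are hypothetical, chosen just to illustrate the loss.

```python
import numpy as np

def softmax(logits, temperature=1.0):
    # Temperature > 1 softens the distribution, exposing the teacher's
    # relative confidence across wrong classes ("dark knowledge").
    z = logits / temperature
    z = z - z.max()  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    # Cross-entropy between the teacher's softened distribution (target)
    # and the student's softened distribution — the core distillation term.
    p_teacher = softmax(teacher_logits, temperature)
    p_student = softmax(student_logits, temperature)
    return -np.sum(p_teacher * np.log(p_student + 1e-12))

# Hypothetical logits for a 3-class problem
teacher = np.array([4.0, 1.0, 0.2])
student = np.array([3.5, 1.2, 0.1])
loss = distillation_loss(student, teacher)
```

In practice this term is usually mixed with the ordinary cross-entropy on ground-truth labels, and the loss is minimized when the student's distribution matches the teacher's.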
Footnotes
1. As defined in this research paper
2. As defined in this IBM blog post