Interesting terms

  • Catastrophic interference/forgetting — the tendency for knowledge of previously learned task(s) to be abruptly lost as information relevant to the current task is incorporated [1]
  • Knowledge distillation — a technique that transfers the knowledge of a large pre-trained model (the “teacher model”) to a smaller one (the “student model”) [2]; see the sketch after this list
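
To make the second term concrete, here is a minimal sketch of a distillation loss in the style commonly used with PyTorch: the student is trained against a blend of the teacher's temperature-softened output distribution and the ground-truth labels. The function name and the `temperature` and `alpha` hyperparameters are illustrative assumptions, not taken from the sources cited below.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=2.0, alpha=0.5):
    # Soft targets: the teacher's temperature-scaled class probabilities
    soft_targets = F.softmax(teacher_logits / temperature, dim=-1)
    soft_student = F.log_softmax(student_logits / temperature, dim=-1)

    # KL divergence between student and teacher distributions,
    # scaled by T^2 to keep gradient magnitudes comparable
    soft_loss = F.kl_div(soft_student, soft_targets,
                         reduction="batchmean") * temperature ** 2

    # Standard cross-entropy against the ground-truth labels
    hard_loss = F.cross_entropy(student_logits, labels)

    # Weighted blend of the two objectives
    return alpha * soft_loss + (1 - alpha) * hard_loss
```

A higher temperature softens the teacher's distribution so the student also learns from the relative probabilities of incorrect classes, while `alpha` trades off imitation of the teacher against fitting the labels directly.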

Footnotes

  1. As defined in this research paper

  2. As defined in this IBM blog post