Interesting terms

  • Catastrophic interference/forgetting — the tendency for knowledge of previously learned task(s) to be abruptly lost as information relevant to the current task is incorporated.¹
  • Knowledge distillation — a technique that transfers the learning of a large pre-trained model (“teacher model”) to a smaller one (“student model”),² typically accepting some loss of quality relative to the original teacher model; see the sketch after this list.
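
A minimal sketch of how a distillation loss is commonly computed, assuming PyTorch; the function name, the tensors student_logits, teacher_logits, and labels, and the temperature/alpha defaults are illustrative, not taken from any particular source.

```python
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=2.0, alpha=0.5):
    """Blend hard-label cross-entropy with a soft-target term that pulls the
    student's predictions toward the teacher's (hypothetical helper)."""
    # Soften both distributions with the same temperature so the student can
    # learn the teacher's relative preferences between classes.
    soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    soft_student = F.log_softmax(student_logits / temperature, dim=-1)
    # KL divergence between softened distributions; the T^2 factor keeps its
    # gradient magnitude comparable to the hard-label term.
    kd_term = F.kl_div(soft_student, soft_teacher,
                       reduction="batchmean") * temperature ** 2
    # Standard cross-entropy against the ground-truth labels.
    ce_term = F.cross_entropy(student_logits, labels)
    return alpha * kd_term + (1 - alpha) * ce_term
```

The temperature controls how much of the teacher's "dark knowledge" (its relative confidence across wrong classes) the student sees, and alpha balances imitating the teacher against fitting the true labels.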

Footnotes

  1. As defined in this research paper

  2. As defined in this IBM blog post