Composing without forgetting
In this project, we propose a modular continual learning approach to address catastrophic forgetting and transfer when learning from evolving task distributions. Concretely, we propose a model that learns, via a local decision rule, to select the modules most relevant to a given task and compose them into a deep network that solves it. In this framework, generalization to unseen but related tasks emerges through the composition of those modules. Additionally, we exploit self-supervised learning to further boost performance through test-time self-supervised finetuning (active remembering). This is of vital importance for Element AI: it provides reusable solutions that scale with new data and improve overall performance, without the need to learn a new model for every problem.
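To make the selection-and-composition idea concrete, the sketch below shows one possible form of a local decision rule: each module stores a prototype of the inputs it was trained on, and at each layer the module whose prototype best matches the current representation is selected, layer by layer, with no global optimization. The module names, the scalar representations, and nearest-prototype routing are illustrative assumptions, not the project's actual rule.

```python
from dataclasses import dataclass
from typing import Callable, List


@dataclass
class Module:
    """A reusable module: a transformation plus a summary (prototype)
    of the inputs it was trained on."""
    name: str
    fn: Callable[[float], float]
    prototype: float


def select(h: float, candidates: List[Module]) -> Module:
    # Local decision rule (assumed here): route to the module whose
    # stored prototype is nearest to the current representation.
    return min(candidates, key=lambda m: abs(h - m.prototype))


def compose(x: float, layers: List[List[Module]]):
    """Build a task-specific model by picking one module per layer."""
    path, h = [], x
    for candidates in layers:
        m = select(h, candidates)
        path.append(m.name)
        h = m.fn(h)
    return h, path


# Two layers of candidate modules (toy example).
layers = [
    [Module("neg-branch", lambda v: -v, prototype=-1.0),
     Module("pos-branch", lambda v: v + 1.0, prototype=1.0)],
    [Module("halve", lambda v: v / 2.0, prototype=4.0),
     Module("square", lambda v: v * v, prototype=0.5)],
]

out, path = compose(2.0, layers)
print(path, out)  # ['pos-branch', 'halve'] 1.5
```

Because selection is purely local, new modules can be added to any layer without retraining the others, and a new task simply induces a new composition path through the existing library.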