This repository contains the PyTorch code for the NeurIPS 2020, 4th Workshop on Meta-Learning paper:
Few-Shot Unsupervised Continual Learning through Meta-Examples
Alessia Bertugli, Stefano Vincenzi, Simone Calderara, Andrea Passerini
Scheme of FUSION-ME. The model is composed of four phases: the embedding learning network phase, the unsupervised task construction phase, the meta-continual training phase, and the meta-continual test phase.
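The sketch below is a rough, illustrative outline of these four phases; the function names, tensor shapes, random clustering, and mean-based aggregation are assumptions for readability, not the repository's actual API (the paper computes meta-examples with an attention mechanism).

```python
# Illustrative outline of the four FUSION-ME phases (names and shapes are assumptions).
import torch

def learn_embeddings(images):
    # Phase 1: embedding learning network (e.g. trained with DeepCluster/ACAI);
    # here a placeholder that just returns random feature vectors.
    return torch.randn(images.shape[0], 64)

def build_unsupervised_tasks(embeddings, num_clusters):
    # Phase 2: unsupervised task construction by clustering the embeddings;
    # clusters may be unbalanced, each one acting as a pseudo-class.
    labels = torch.randint(0, num_clusters, (embeddings.shape[0],))  # stand-in for k-means
    return [embeddings[labels == c] for c in range(num_clusters)]

def meta_continual_train(tasks):
    # Phase 3: meta-continual training; each cluster is summarized by a single
    # meta-example (a simple mean here, attention-based aggregation in the paper).
    for cluster in tasks:
        if len(cluster) == 0:
            continue
        meta_example = cluster.mean(dim=0)
        # ...inner/outer meta-updates using the meta-example would go here...

def meta_continual_test(model, unseen_tasks):
    # Phase 4: meta-continual test on tasks built from unseen classes.
    pass

if __name__ == "__main__":
    images = torch.randn(256, 1, 28, 28)  # dummy data
    tasks = build_unsupervised_tasks(learn_embeddings(images), num_clusters=10)
    meta_continual_train(tasks)
```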
- Python >= 3.8
- PyTorch >= 1.5
- CUDA 10.0
You can generate the embeddings for Mini-ImageNet and SlimageNet64 using the DeepCluster code, and for Omniglot using the ACAI code, or download them from here.
Available soon.
- Download the embeddings from the link above, then set the `data_folder` variable in the `get_embeddings` function contained in the `dataset/utils.py` file equal to your dataset path;
- in the file `trainers/fusion.py`, set the arg `--dataset` equal to the dataset name you want to train on (e.g. Omniglot or Imagenet);
- set the arg `--attention` to exploit the meta-examples and `--num_clusters` to the desired number of clusters;
- run the file `trainers/fusion.py` (see the sketch after this list).
- Note that the unsupervised task construction is carried out by the function `cactus_unbalance`, defined in the file `dataset/dataset_factory` and executed in the `trainers/fusion.py` file.
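As a rough guide to how these pieces fit together, here is a minimal sketch assuming simplified signatures for `get_embeddings` and `cactus_unbalance` and an argparse-style command line; the flag types, defaults, and placeholder path are assumptions, not the repository's exact interface.

```python
# Minimal sketch of how the arguments above fit together (signatures are assumptions).
import argparse

def get_embeddings(data_folder, dataset):
    """Load the pre-computed DeepCluster/ACAI embeddings from data_folder (assumed helper)."""
    ...

def cactus_unbalance(embeddings, num_clusters):
    """Cluster the embeddings into (possibly unbalanced) partitions used as tasks (assumed helper)."""
    ...

def main():
    parser = argparse.ArgumentParser()
    parser.add_argument("--dataset", type=str, default="Omniglot")  # Omniglot or Imagenet
    parser.add_argument("--attention", action="store_true")         # exploit meta-examples
    parser.add_argument("--num_clusters", type=int, default=500)    # number of clusters
    args = parser.parse_args()

    data_folder = "/path/to/embeddings"  # in the real code, set in dataset/utils.py
    embeddings = get_embeddings(data_folder, args.dataset)
    tasks = cactus_unbalance(embeddings, args.num_clusters)
    # ...meta-continual training over `tasks`, using meta-examples if args.attention is set...

if __name__ == "__main__":
    main()
```

For example, a run might look like `python trainers/fusion.py --dataset Omniglot --attention --num_clusters 500` (the exact flag values are illustrative).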
- CACTUs-MAML: Clustering to Automatically Generate Tasks for Unsupervised Model-Agnostic Meta-Learning
- Meta-Learning Representations for Continual Learning
- Deep Clustering for Unsupervised Learning of Visual Features
If you have any questions, please contact [email protected] or [email protected], or open an issue on this repo.
If you find this repository useful for your research, please cite the following paper:
@article{Bertugli2020fusion-me,
title={Few-Shot Unsupervised Continual Learning through Meta-Examples},
author={Alessia Bertugli and Stefano Vincenzi and Simone Calderara and Andrea Passerini},
journal={34th Conference on Neural Information Processing Systems (NeurIPS 2020), 4th Workshop on Meta-Learning},
year={2020},
volume={abs/2009.08107}
}