Category: Publications

  • Towards Exemplar-Free Continual Learning in Vision Transformers: an Account of Attention, Functional and Weight Regularization

    Francesco Pelosin, Saurav Jha, Andrea Torsello, Bogdan Raducanu, Joost van de Weijer
    In this paper, we investigate the continual learning of Vision Transformers (ViT) for the challenging exemplar-free scenario, with special focus on how to efficiently distill the knowledge of its crucial self-attention mechanism (SAM). Our work takes an initial step towards a surgical investigation […]

  • Continually Learning Self-Supervised Representations with Projected Functional Regularization

    Alex Gomez-Villa, Bartlomiej Twardowski, Lu Yu, Andrew D. Bagdanov, Joost van de Weijer
    Recent self-supervised learning methods are able to learn high-quality image representations and are closing the gap with supervised approaches. However, these methods are unable to acquire new knowledge incrementally; they are, in fact, mostly used only as a pre-training phase over […]

  • Area Under the ROC Curve Maximization for Metric Learning

    Bojana Gajić, Ariel Amato, Ramon Baldrich, Joost van de Weijer, Carlo Gatta
    Most popular metric learning losses have no direct relation with the evaluation metrics that are subsequently applied to evaluate their performance. We hypothesize that training a metric learning model by maximizing the area under the ROC curve (which is […]

  • Incremental Meta-Learning via Episodic Replay Distillation for Few-Shot Image Recognition

    Kai Wang, Xialei Liu, Andy Bagdanov, Luis Herranz, Shangling Jui, Joost van de Weijer
    Most meta-learning approaches assume the existence of a very large set of labeled data available for episodic meta-learning of base knowledge. This contrasts with the more realistic continual learning paradigm in which data arrives incrementally in the form of tasks containing disjoint […]

  • Transferring Unconditional to Conditional GANs with Hyper-Modulation

    Héctor Laria, Yaxing Wang, Joost van de Weijer, Bogdan Raducanu
    GANs have matured in recent years and are able to generate high-resolution, realistic images. However, the computational resources and the data required for the training of high-quality GANs are enormous, and the study of transfer learning of these models is therefore an urgent topic. […]

  • Class-Balanced Active Learning for Image Classification

    Javad Zolfaghari Bengar, Joost van de Weijer, Laura Lopez Fuentes, Bogdan Raducanu
    Active learning aims to reduce the labeling effort required to train algorithms by learning an acquisition function that selects the most relevant data for which a label should be requested from a large unlabeled data pool. Active learning is generally studied […]

  • HCV: Hierarchy-Consistency Verification for Incremental Implicitly-Refined Classification

    Kai Wang, Xialei Liu, Luis Herranz, Joost van de Weijer
    Human beings learn and accumulate hierarchical knowledge over their lifetime. This knowledge is associated with previous concepts for consolidation and hierarchical construction. However, current incremental learning methods lack the ability to build a concept hierarchy by associating new concepts to old ones. A more […]

  • Exploiting the Intrinsic Neighborhood Structure for Source-free Domain Adaptation

    Shiqi Yang, Yaxing Wang, Joost van de Weijer, Luis Herranz, Shangling Jui
    Domain adaptation (DA) aims to alleviate the domain shift between the source and target domains. Most DA methods require access to the source data, but often that is not possible (e.g. due to data privacy or intellectual property). In this paper, we address […]

  • Generalized Source-free Domain Adaptation

    Shiqi Yang, Yaxing Wang, Joost van de Weijer, Luis Herranz, Shangling Jui
    Domain adaptation (DA) aims to transfer the knowledge learned from a source domain to an unlabeled target domain. Some recent works tackle source-free domain adaptation (SFDA), where only a source pre-trained model is available for adaptation to the target domain. However, those methods […]

  • TransferI2I: Transfer Learning for Image-to-Image Translation from Small Datasets

    Yaxing Wang, Hector Laria Mantecon, Joost van de Weijer, Laura Lopez-Fuentes, Bogdan Raducanu
    Image-to-image (I2I) translation has matured in recent years and is able to generate high-quality, realistic images. However, despite current success, it still faces important challenges when applied to small domains. Existing methods use transfer learning for I2I translation, but they still require […]