Category: Publications

  • Transferring Unconditional to Conditional GANs with Hyper-Modulation

    Héctor Laria, Yaxing Wang, Joost van de Weijer, Bogdan Raducanu
    GANs have matured in recent years and are able to generate high-resolution, realistic images. However, the computational resources and the data required for the training of high-quality GANs are enormous, and the study of transfer learning of these models is therefore an urgent topic. […]

  • Class-Balanced Active Learning for Image Classification

    Javad Zolfaghari Bengar, Joost van de Weijer, Laura Lopez Fuentes, Bogdan Raducanu
    Active learning aims to reduce the labeling effort required to train algorithms by learning an acquisition function that selects, from a large unlabeled data pool, the most relevant data for which a label should be requested. Active learning is generally studied […]
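    The truncated abstract only hints at the acquisition step. As a rough, hedged illustration (not the paper's actual criterion), the sketch below scores unlabeled samples by predictive entropy and down-weights samples whose predicted class is already well represented in the labeled set; the function name, the weighting scheme, and the hyperparameter alpha are assumptions for illustration.

    ```python
    import numpy as np

    def class_balanced_scores(probs, labeled_class_counts, alpha=1.0):
        """Illustrative acquisition scores (assumed scheme, not the paper's method).

        probs:                (N, C) softmax outputs of the current model on the unlabeled pool.
        labeled_class_counts: (C,) number of labeled samples per class.
        alpha:                strength of the class-balancing penalty (assumed hyperparameter).
        """
        eps = 1e-12
        entropy = -np.sum(probs * np.log(probs + eps), axis=1)      # uncertainty per sample
        pred = probs.argmax(axis=1)                                 # predicted class per sample
        freq = labeled_class_counts / max(labeled_class_counts.sum(), 1)
        penalty = 1.0 / (1.0 + alpha * freq[pred])                  # rarer predicted class -> larger weight
        return entropy * penalty

    # Toy usage: request labels for the top-16 scoring samples.
    probs = np.random.dirichlet(np.ones(10), size=1000)
    counts = np.random.randint(0, 50, size=10)
    query_indices = np.argsort(-class_balanced_scores(probs, counts))[:16]
    ```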

  • HCV: Hierarchy-Consistency Verification for Incremental Implicitly-Refined Classification

    Kai Wang, Xialei Liu, Luis Herranz, Joost van de Weijer
    Human beings learn and accumulate hierarchical knowledge over their lifetime. This knowledge is associated with previous concepts for consolidation and hierarchical construction. However, current incremental learning methods lack the ability to build a concept hierarchy by associating new concepts with old ones. A more […]
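    As a loose, hedged sketch of what a hierarchy-consistency check can look like (assumed interface, not the authors' HCV procedure), the snippet below accepts a fine-grained prediction only if it agrees with the predicted coarse parent class under a known parent mapping, and otherwise falls back to the most probable fine class consistent with the coarse prediction.

    ```python
    import numpy as np

    def hierarchy_consistent_prediction(fine_probs, coarse_probs, parent_of):
        """Illustrative consistency check between a fine and a coarse classifier.

        fine_probs:   (C_fine,) softmax over fine-grained classes.
        coarse_probs: (C_coarse,) softmax over coarse (parent) classes.
        parent_of:    (C_fine,) coarse parent index of each fine class (assumed known).
        """
        fine = int(fine_probs.argmax())
        coarse = int(coarse_probs.argmax())
        if parent_of[fine] == coarse:
            return fine                                  # already hierarchy-consistent
        # Otherwise restrict to fine classes whose parent matches the coarse prediction.
        consistent = np.where(parent_of == coarse)[0]
        return int(consistent[fine_probs[consistent].argmax()])
    ```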

  • Exploiting the Intrinsic Neighborhood Structure for Source-free Domain Adaptation

    Shiqi Yang, Yaxing Wang, Joost van de Weijer, Luis Herranz, Shangling Jui
    Domain adaptation (DA) aims to alleviate the domain shift between a source domain and a target domain. Most DA methods require access to the source data, but often that is not possible (e.g., due to data privacy or intellectual property). In this paper, we address […]
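    As a rough illustration of exploiting the neighborhood structure of target features (a hedged sketch under assumed shapes and names, not the paper's exact objective), the snippet below encourages each target sample's prediction to agree with the predictions of its nearest neighbors in feature space.

    ```python
    import numpy as np

    def neighborhood_consistency_loss(features, probs, k=5):
        """Illustrative sketch: pull each prediction towards its k nearest neighbors' predictions.

        features: (N, D) L2-normalized target features from the source-pretrained model.
        probs:    (N, C) current softmax predictions for the same target samples.
        Returns a scalar loss (negative mean dot product with averaged neighbor predictions).
        """
        sim = features @ features.T                  # cosine similarity (features are normalized)
        np.fill_diagonal(sim, -np.inf)               # exclude each sample from its own neighborhood
        nn_idx = np.argsort(-sim, axis=1)[:, :k]     # indices of the k nearest neighbors
        neighbor_probs = probs[nn_idx].mean(axis=1)  # (N, C) average neighbor prediction
        return -np.mean(np.sum(probs * neighbor_probs, axis=1))
    ```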

  • Generalized Source-free Domain Adaptation

    Shiqi Yang, Yaxing Wang, Joost van de Weijer, Luis Herranz, Shangling Jui
    Domain adaptation (DA) aims to transfer the knowledge learned from a source domain to an unlabeled target domain. Some recent works tackle source-free domain adaptation (SFDA), where only a source pre-trained model is available for adaptation to the target domain. However, those methods […]

  • TransferI2I: Transfer Learning for Image-to-Image Translation from Small Datasets

    Yaxing Wang, Hector Laria Mantecon, Joost van de Weijer, Laura Lopez-Fuentes, Bogdan Raducanu
    Image-to-image (I2I) translation has matured in recent years and is able to generate high-quality, realistic images. However, despite current success, it still faces important challenges when applied to small domains. Existing methods use transfer learning for I2I translation, but they still require […]

  • Avalanche: an End-to-End Library for Continual Learning

    Vincenzo Lomonaco, Lorenzo Pellegrini, Andrea Cossu, Antonio Carta, Gabriele Graffieti, Tyler L. Hayes, Matthias De Lange, Marc Masana, Jary Pomponi, Gido van de Ven, Martin Mundt, Qi She, Keiland Cooper, Jeremy Forest, Eden Belouadah, Simone Calderara, German I. Parisi, Fabio Cuzzolin, Andreas Tolias, Simone Scardapane, Luca Antiga, Subutai Ahmad, Adrian Popescu, Christopher Kanan, Joost van de Weijer, Tinne Tuytelaars, Davide Bacciu, Davide Maltoni
    Learning continually from non-stationary data streams is a long-standing goal and a challenging problem in […]

  • Ternary Feature Masks: zero-forgetting for task-incremental learning

    Marc Masana, Tinne Tuytelaars, Joost van de Weijer
    We propose a continual learning approach without any forgetting for the task-aware regime, where the task label is known at inference. By using ternary masks, we can upgrade a model to new tasks, reusing knowledge from previous tasks while not forgetting anything about them. Using […]
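    To make the masking idea concrete, here is a minimal, hedged sketch (assumed semantics, not the paper's implementation): a per-task ternary mask decides, for each feature of a shared frozen layer, whether it is unused, reused as-is, or reused with a small task-specific affine correction; only the task-specific parameters are trained, so earlier tasks are untouched.

    ```python
    import torch

    def apply_ternary_mask(features, mask, task_scale, task_bias):
        """Illustrative ternary feature masking (assumed semantics).

        features:   (B, F) activations of a shared, frozen layer.
        mask:       (F,) integers in {0, 1, 2} selected for the current task:
                    0 = feature unused, 1 = reused as-is, 2 = reused with a
                    task-specific affine correction.
        task_scale, task_bias: (F,) task-specific parameters (the only trained ones here).
        """
        reused    = features * (mask == 1).float()
        corrected = (features * task_scale + task_bias) * (mask == 2).float()
        return reused + corrected   # features with mask == 0 contribute nothing

    # Toy usage for one task.
    B, F = 4, 8
    feats = torch.randn(B, F)
    mask = torch.randint(0, 3, (F,))
    scale = torch.ones(F, requires_grad=True)
    bias = torch.zeros(F, requires_grad=True)
    out = apply_ternary_mask(feats, mask, scale, bias)
    ```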

  • Continual learning in cross-modal retrieval

    Kai Wang, Luis Herranz, Joost van de Weijer
    Multimodal representations and continual learning are two areas closely related to human intelligence. The former considers the learning of shared representation spaces where information from different modalities can be compared and integrated (we focus on cross-modal retrieval between language and visual representations). The latter studies […]

  • DANICE: Domain adaptation without forgetting in neural image compression

    Sudeep Katakol, Luis Herranz, Fei Yang, Marta Mrak
    Neural image compression (NIC) is a new coding paradigm where coding capabilities are captured by deep models learned from data. This data-driven nature enables new potential functionalities. In this paper, we study the adaptability of codecs to custom domains of interest. We show that NIC codecs […]