LLEDA - Lifelong Self-Supervised Domain Adaptation

Mamatha Thota*, Dewei Yi, Georgios Leontidis

*Corresponding author for this work

Research output: Working paper › Preprint


Abstract

Lifelong domain adaptation remains a challenging task in machine learning due to the differences among domains and the unavailability of historical data. The ultimate goal is to learn distributional shifts while retaining previously gained knowledge. Inspired by the Complementary Learning Systems (CLS) theory, we propose a novel framework called Lifelong Self-Supervised Domain Adaptation (LLEDA). LLEDA addresses catastrophic forgetting by replaying hidden representations rather than raw data pixels, and enables domain-agnostic knowledge transfer through self-supervised learning. LLEDA does not access labels from either the source or the target domain and has access to only a single domain at any given time. Extensive experiments demonstrate that the proposed method outperforms several other methods, achieves long-term adaptation, and is less prone to catastrophic forgetting when transferred to new domains.
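The abstract describes two mechanisms: replaying hidden representations (latent replay) instead of raw pixels, and unsupervised, domain-agnostic training via self-supervised learning. The PyTorch sketch below illustrates how these two ideas can be combined in principle; it is not the LLEDA implementation. The encoder split, the `LatentReplayBuffer`, the negative-cosine consistency loss, and all hyperparameters are assumptions made purely for illustration.

```python
# Illustrative sketch only: latent replay plus a simple self-supervised
# consistency loss, with no labels from any domain. Not the LLEDA code.
import random
import torch
import torch.nn as nn
import torch.nn.functional as F


class LatentReplayBuffer:
    """Stores hidden representations (not raw pixels) from past domains,
    paired with the projections they produced when they were stored."""

    def __init__(self, capacity=1024):
        self.capacity = capacity
        self.items = []  # list of (latent, projection) tensor pairs

    def add(self, latents, projections):
        for z, p in zip(latents.detach().cpu(), projections.detach().cpu()):
            if len(self.items) < self.capacity:
                self.items.append((z, p))
            else:  # overwrite a random slot once the buffer is full
                self.items[random.randrange(self.capacity)] = (z, p)

    def sample(self, n):
        batch = random.sample(self.items, min(n, len(self.items)))
        zs, ps = zip(*batch)
        return torch.stack(zs), torch.stack(ps)


def neg_cosine(pred, target):
    # Negative cosine similarity; the target is detached so it acts as a fixed anchor.
    return -F.cosine_similarity(pred, target.detach(), dim=-1).mean()


# Hypothetical encoder split: "lower" produces the latents that get replayed,
# "upper" maps latents to a projection space used by the self-supervised loss.
lower = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 256), nn.ReLU())
upper = nn.Sequential(nn.Linear(256, 128), nn.ReLU(), nn.Linear(128, 64))
optimizer = torch.optim.SGD(
    list(lower.parameters()) + list(upper.parameters()), lr=1e-2
)
buffer = LatentReplayBuffer()


def adaptation_step(x_view1, x_view2):
    """One unlabeled step on the current domain: a view-consistency loss
    (standing in for a full SSL objective such as BYOL or Barlow Twins),
    plus a latent-replay loss that keeps the upper network's outputs stable
    on representations stored from earlier domains."""
    h1, h2 = lower(x_view1), lower(x_view2)
    p1, p2 = upper(h1), upper(h2)
    loss = neg_cosine(p1, p2) + neg_cosine(p2, p1)      # current-domain SSL term
    if buffer.items:
        h_old, p_old = buffer.sample(x_view1.size(0))
        loss = loss + neg_cosine(upper(h_old), p_old)   # replay on stored latents
    buffer.add(h1, p1)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()


# Example usage with two random "augmented views" of a batch of 3x32x32 images.
views = torch.randn(8, 3, 32, 32), torch.randn(8, 3, 32, 32)
print(adaptation_step(*views))
```

Replaying stored latents rather than raw images is memory-efficient and avoids revisiting source pixels, which matches the motivation stated in the abstract; the particular replay objective used here (consistency with stored projections) is one plausible choice, not the paper's.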
Original language: English
Publisher: ArXiv
Number of pages: 10
DOIs
Publication status: Published - 12 Nov 2022

Keywords

  • Self-supervised learning
  • Lifelong learning

