
AAAI 2026 Main Conference

January 22, 2026

Singapore


Recent works have shown that sequential fine-tuning (SeqFT) of pre-trained vision transformers (ViTs), followed by classifier refinement using approximate distributions of class features, offers an effective solution to class-incremental learning (CIL). However, this approach suffers from distribution drift: sequential optimization of the shared backbone parameters creates a mismatch between the stored approximate distributions of previous classes and the feature space of the updated model, which degrades classifier refinement over time. To tackle this issue, we introduce a latent space transition operator, on which we build the Sequential Learning with Drift Compensation (SLDC) method. First, the linear SLDC variant estimates a linear operator by solving a regularized least-squares problem between pre- and post-optimization features. Then, the weak-nonlinear SLDC variant, which assumes that suitable transition operators lie at the intersection of the linear and nonlinear regimes, constructs learnable weak-nonlinear transformations. Finally, in both variants, knowledge distillation (KD) is applied to further mitigate representation drift. Extensive experiments on CIL benchmarks demonstrate that SLDC significantly enhances the performance of SeqFT. Notably, by combining KD (to reduce representation drift) with SLDC (to counteract distribution drift), SeqFT achieves performance comparable to joint training across all evaluated datasets.
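The linear SLDC step can be illustrated with a small sketch. Under the assumption (mine, not confirmed by the abstract) that the operator is fit by ridge-regularized least squares mapping pre-optimization features onto post-optimization features, the closed-form solution is W = (Xᵀ_old X_old + λI)⁻¹ Xᵀ_old X_new; the function name, the regularization weight, and the toy data below are all illustrative, not from the paper.

```python
import numpy as np

def estimate_linear_transition(feats_old, feats_new, lam=1e-3):
    """Estimate a linear latent-space transition operator W so that
    feats_old @ W approximates feats_new, via ridge-regularized
    least squares. A sketch only; the paper's exact formulation,
    regularizer, and solver may differ."""
    d = feats_old.shape[1]
    # Normal equations with Tikhonov regularization:
    # (X_old^T X_old + lam * I) W = X_old^T X_new
    A = feats_old.T @ feats_old + lam * np.eye(d)
    B = feats_old.T @ feats_new
    return np.linalg.solve(A, B)

# Toy usage: simulate drifted features related by an orthogonal
# "backbone drift" plus small noise, then recover the operator.
rng = np.random.default_rng(0)
X_old = rng.normal(size=(500, 8))                 # features before fine-tuning
Q, _ = np.linalg.qr(rng.normal(size=(8, 8)))      # ground-truth drift operator
X_new = X_old @ Q + 0.01 * rng.normal(size=(500, 8))

W = estimate_linear_transition(X_old, X_new)

# Compensate the stored class statistics of a previous task:
# transport the old class mean into the updated feature space.
mu_old = X_old.mean(axis=0)
mu_transported = mu_old @ W
```

Transporting stored class means (and, analogously, covariances) through W is one plausible way such an operator could realign previous-class distributions with the updated backbone before classifier refinement.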

