Continual test-time adaptation (CTTA) enables online model adjustment under dynamic distribution shifts in real-world environments. However, most existing CTTA frameworks adopt fixed model architectures and therefore lack the structural flexibility required for deployment across heterogeneous edge devices with varying computational capacities. To address this, we propose an elastic framework for edge CTTA that performs resource-aware dynamic model search based on a pre-trained binary Supernet, generating personalized models tailored to the resource constraints of each edge device. To handle the evolving distribution of unlabeled data on edge devices during deployment, we introduce a pluggable lightweight fine-tuning mechanism: low-rank adapters inserted into the frozen binary backbone enable continual self-supervised adaptation with minimal computational overhead. In addition, we propose a structure-aware knowledge reflux mechanism that transfers the adaptation experience of fine-tuned edge models back into the Supernet. By distilling knowledge into structurally aligned Supernet paths, it improves future architecture search without requiring retraining. Experiments on multiple benchmarks show that our method achieves state-of-the-art performance while significantly reducing resource consumption, and that models re-searched after knowledge reflux improve further.
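To make the pluggable fine-tuning idea concrete, the sketch below shows the standard low-rank-adapter pattern on a single frozen linear layer: the backbone weight stays fixed while only two small rank-r factors are trainable. This is a minimal illustration of the general technique, not the paper's implementation; the class name, the rank, and the zero-initialization of one factor are our assumptions.

```python
import numpy as np

class FrozenLinearWithAdapter:
    """Illustrative low-rank adapter on a frozen linear layer (not the paper's code)."""

    def __init__(self, W0, r, rng):
        self.W0 = W0                           # frozen backbone weight, shape (d_out, d_in)
        d_out, d_in = W0.shape
        # Only these two low-rank factors would be updated during adaptation.
        self.A = rng.standard_normal((r, d_in)) * 0.01
        self.B = np.zeros((d_out, r))          # zero-init: adapter starts as a no-op delta

    def forward(self, x):
        # y = W0 x + B (A x): the frozen path plus a rank-r trainable correction.
        return self.W0 @ x + self.B @ (self.A @ x)

rng = np.random.default_rng(0)
W0 = rng.standard_normal((4, 8))
layer = FrozenLinearWithAdapter(W0, r=2, rng=rng)
x = rng.standard_normal(8)
# With B zero-initialized, the adapted layer reproduces the frozen layer exactly,
# so plugging the adapter in does not perturb the deployed model before fine-tuning.
assert np.allclose(layer.forward(x), W0 @ x)
```

Because the adapter contributes `r * (d_in + d_out)` parameters instead of `d_in * d_out`, the per-layer fine-tuning cost stays small relative to the frozen backbone, which is what makes continual on-device adaptation affordable.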
