Catastrophic forgetting remains a fundamental barrier to artificial continual learning (CL), a capability innate to humans. Existing CL methods often incur prohibitive computational costs in resource-constrained settings. Spiking neural networks (SNNs), with their biological plausibility and energy efficiency, offer distinct advantages for CL. Inspired by cortico-hippocampal memory mechanisms, we propose HLML-SNN, a spiking neural network framework that integrates Hebbian plasticity with meta-learning. The architecture emulates a dual-phase CL process: (1) in the short-term phase, sample-level Hebbian learning rapidly adapts to new inputs through local synaptic updates; (2) in the long-term phase, task-level meta-learning optimizes cross-task parameters using the consolidated synaptic weights, mimicking cortical memory integration to refine shared representations and to initialize subsequent Hebbian learning. HLML-SNN thereby incrementally transforms short-term adaptations into stable long-term knowledge: the synergy of rapid local synaptic updates and meta-driven global optimization enables efficient continual learning while balancing stability and plasticity. Empirical results establish HLML-SNN's state-of-the-art performance on the split versions of MNIST, CIFAR-10, CIFAR-100, and TinyImageNet, while markedly reducing training time compared to existing methods, demonstrating substantial practical potential for rapid-deployment scenarios.
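The dual-phase structure described above can be sketched as a minimal toy loop. This is an illustrative assumption, not the paper's implementation: it uses a rate-based tanh activation as a stand-in for spiking dynamics, a plain outer-product Hebbian rule for the short-term phase, and a Reptile-style interpolation toward task-adapted weights as a stand-in for the task-level meta-update; the function names (`hebbian_step`, `meta_step`) and all hyperparameters are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def hebbian_step(w, x, lr=0.01):
    """Short-term phase: sample-level local synaptic update.

    Simplified Hebbian rule (an assumption): strengthen each weight in
    proportion to pre/post co-activation via an outer product.
    """
    y = np.tanh(w @ x)               # post-synaptic activity (rate proxy for spikes)
    return w + lr * np.outer(y, x)   # local, gradient-free weight update

def meta_step(w_init, tasks, meta_lr=0.1):
    """Long-term phase: task-level consolidation of shared parameters.

    Reptile-style meta-update (an assumed stand-in for the paper's
    meta-learning): after Hebbian adaptation on each task, move the shared
    initialization toward the adapted weights, so it both integrates
    cross-task knowledge and seeds the next round of Hebbian learning.
    """
    for task in tasks:
        w = w_init.copy()
        for x in task:                            # rapid local adaptation
            w = hebbian_step(w, x)
        w_init = w_init + meta_lr * (w - w_init)  # consolidate into long-term weights
    return w_init

# Toy usage: two "tasks", each a handful of 8-dimensional input samples.
tasks = [[rng.standard_normal(8) for _ in range(4)] for _ in range(2)]
w0 = 0.1 * rng.standard_normal((3, 8))
w1 = meta_step(w0, tasks)
```

The interpolation coefficient `meta_lr` plays the stability/plasticity role highlighted in the abstract: small values preserve consolidated knowledge, larger values let new tasks reshape the shared initialization faster.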