Temporal graphs are essential for modeling complex real-world systems such as social interactions, financial transactions, and recommendation systems, but high computational cost and model complexity pose practical challenges for deploying dynamic graph neural networks (DGNNs). Although various pruning and sampling techniques have proven effective at accelerating static GNNs, these approaches fall short in dynamic settings due to the temporal dependencies in evolving graph structures. To address these challenges, we propose TrimDG, a general framework that accelerates DGNNs by eliminating both static and runtime redundancy. For static redundancy, we design a novel node-influence metric, Temporal Personalized PageRank (TPP), to prune less informative nodes, and apply temporal binning to remove redundant events. For runtime redundancy during training, we introduce an adaptive sampling strategy guided by graph bottlenecks, and we reduce sampling frequency via a temporal batch selector and a sampling cache. Theoretical analysis supports our design, and experiments on real-world datasets show that TrimDG reduces runtime by an average of 83.80% across diverse DGNN backbones while maintaining strong predictive performance, demonstrating both its efficiency and generalizability.
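To make the "temporal binning" idea concrete, here is a minimal illustrative sketch. It assumes events are `(src, dst, timestamp)` tuples and treats repeated interactions between the same node pair within one time bin as redundant; the abstract does not specify TrimDG's exact redundancy criterion, so this is an assumption, not the paper's implementation.

```python
def temporal_binning(events, bin_width):
    """Hedged sketch of temporal binning (assumed criterion, not TrimDG's
    exact method): keep one representative event per (src, dst) pair per
    time bin of width `bin_width`, dropping the rest as redundant."""
    seen = set()
    kept = []
    # Process events in time order so the earliest event in each bin is kept.
    for src, dst, t in sorted(events, key=lambda e: e[2]):
        key = (src, dst, int(t // bin_width))  # bin index for this pair
        if key not in seen:
            seen.add(key)
            kept.append((src, dst, t))
    return kept

events = [(0, 1, 0.5), (0, 1, 0.9), (0, 1, 3.2), (2, 1, 0.7)]
print(temporal_binning(events, bin_width=1.0))
# Drops (0, 1, 0.9), which falls in the same bin as (0, 1, 0.5)
```

Coarser bins remove more events (and more temporal detail), so the bin width trades acceleration against fidelity of the retained event stream.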