Graph Neural Networks (GNNs) have demonstrated strong performance across various tasks by leveraging the structural information inherent in graph-structured data. To address the challenge of edge heterophily, where connected nodes may have dissimilar labels or features, two main families of GNNs have emerged: Mixture-of-Experts (MoE) based spatial GNNs and frequency-filtering-based spectral GNNs. While MoE-based spatial GNNs intuitively assign experts to different hops without solid theoretical grounding, spectral GNNs rest on principled insights from graph signal processing but often rely on manually designed filters and global operators, limiting their scalability and adaptability. In this work, we identify an inherent connection between the two families by showing that the eigengraph components in spectral methods can be treated as experts within an MoE framework. Building on this insight, we propose MORGAN, a novel spectral GNN that integrates Mixture-of-Experts into the spectral domain. MORGAN performs an eigen-decomposition of the graph Laplacian, partitions the spectrum into multiple frequency bands, and assigns a dedicated expert network to each band. A learnable gating function then combines these experts dynamically according to the spectral characteristics of the input. To support scalable and inductive learning, we further develop MORGAN(L), which incorporates subgraph sampling to enable localized spectral filtering without requiring full access to the graph Laplacian. Extensive experiments on 16 real-world benchmark datasets show that MORGAN achieves competitive or superior performance compared to state-of-the-art baselines, particularly for inductive node classification in heterophilic settings.
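The spectral mixture-of-experts idea described above (eigen-decomposition, spectrum partitioned into frequency bands, one expert per band, a gating function mixing them) can be sketched in a few lines of NumPy. This is a minimal illustrative sketch of the general technique, not the authors' implementation: the function name, the equal-width band partition, the random (untrained) expert and gate weights, and the per-node softmax gating are all assumptions made for the example.

```python
import numpy as np

def spectral_moe_forward(A, X, num_bands=3, seed=0):
    """Illustrative sketch (hypothetical, untrained): band-wise spectral
    filtering of node features X on adjacency A, mixed by softmax gates."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    # Symmetric normalized Laplacian: L = I - D^{-1/2} A D^{-1/2}
    deg = A.sum(axis=1)
    d_inv_sqrt = np.where(deg > 0, deg ** -0.5, 0.0)
    L = np.eye(n) - d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]
    # Eigen-decomposition; eigenvalues ascend from low to high frequency
    eigvals, U = np.linalg.eigh(L)
    # Partition the spectrum into equal-width frequency bands (an assumption;
    # any partition of the eigenvalue range would fit the same scheme)
    edges = np.linspace(eigvals.min(), eigvals.max(), num_bands + 1)
    band_outputs = []
    for b in range(num_bands):
        lo, hi = edges[b], edges[b + 1]
        mask = (eigvals >= lo) & ((eigvals < hi) if b < num_bands - 1 else (eigvals <= hi))
        # Projector onto this band's eigengraph (the "expert's" subspace)
        P_b = U[:, mask] @ U[:, mask].T
        # Expert network, stubbed here as one random linear layer
        W_b = rng.standard_normal((d, d)) / np.sqrt(d)
        band_outputs.append(P_b @ X @ W_b)
    # Gating: per-node softmax scores over the bands (random gate weights)
    scores = X @ rng.standard_normal((d, num_bands))
    scores -= scores.max(axis=1, keepdims=True)          # numerical stability
    gates = np.exp(scores) / np.exp(scores).sum(axis=1, keepdims=True)
    # Convex combination of expert outputs, one mixture per node
    return sum(gates[:, b:b + 1] * band_outputs[b] for b in range(num_bands))
```

In a trained model the expert and gate weights would be learned parameters rather than random draws, and the scalable MORGAN(L) variant would replace the full eigen-decomposition with sampled-subgraph Laplacians; both details are beyond this sketch.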