AAAI 2026

January 22, 2026

Singapore, Singapore


Evolutionary algorithms (EAs) are optimization algorithms that simulate natural selection and genetic mechanisms. Despite advances, existing EAs suffer from two main issues: (1) they rarely update next-generation individuals based on global correlations, which limits comprehensive learning; and (2) balancing exploration and exploitation is difficult: excessive exploitation leads to premature convergence to local optima, while excessive exploration makes the search prohibitively slow. Moreover, existing EAs rely heavily on manual parameter settings, and inappropriate parameters can disrupt the exploration-exploitation balance, further impairing performance. To address these challenges, we propose a novel evolutionary algorithm framework called Graph Neural Evolution (GNE). Unlike traditional EAs, GNE represents the population as a graph, where nodes correspond to individuals and edges capture their relationships, thereby effectively leveraging global information. GNE uses spectral graph neural networks (GNNs) to decompose evolutionary signals into their frequency components and designs a filtering function to fuse these components: high-frequency components capture diverse global information, while low-frequency components capture more consistent information. This explicit frequency-filtering strategy controls global-scale features directly through the frequency components, overcoming the limitations of manual parameter settings and making exploration-exploitation control more interpretable and effective. Extensive evaluations on nine benchmark functions (e.g., Sphere, Rastrigin, and Rosenbrock) show that GNE consistently outperforms both classical algorithms (GA, DE, CMA-ES) and advanced algorithms (SDAES, RL-SHADE) under various conditions, including original, noise-corrupted, and optimal-solution-deviation scenarios. GNE achieves solution quality several orders of magnitude better than the other algorithms (e.g., a mean of 3.07e-20 on Sphere vs. 1.51e-07).
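The core idea described in the abstract, filtering population signals through the spectrum of a graph over individuals, can be sketched in a few lines. The following is a minimal illustrative sketch, not the paper's implementation: the Gaussian similarity graph, the filter shape `h`, and the `alpha` knob that trades low-frequency (consensus/exploitation) against high-frequency (diversity/exploration) components are all assumptions made for illustration.

```python
import numpy as np

def spectral_filter_step(population, alpha=0.5):
    """One hypothetical GNE-style update on a population matrix of
    shape (n_individuals, n_dims). `alpha` in [0, 1] weights
    low-frequency components; (1 - alpha) weights high-frequency ones.
    """
    # Fully connected similarity graph over individuals (assumed design).
    dists = np.linalg.norm(population[:, None] - population[None, :], axis=-1)
    W = np.exp(-dists**2)
    np.fill_diagonal(W, 0.0)
    D = np.diag(W.sum(axis=1))
    L = D - W  # combinatorial graph Laplacian (symmetric)

    # Graph Fourier basis: eigenvectors of L, ordered by frequency.
    evals, evecs = np.linalg.eigh(L)

    # Spectral coefficients of each coordinate signal across the population.
    coeffs = evecs.T @ population

    # Illustrative filter: low-pass term weighted by alpha,
    # high-pass complement weighted by (1 - alpha).
    h = alpha * np.exp(-evals) + (1 - alpha) * (1 - np.exp(-evals))

    # Reconstruct the filtered population in the vertex domain.
    return evecs @ (h[:, None] * coeffs)
```

With `alpha` near 1 the update pulls individuals toward smooth, population-consistent directions (exploitation); with `alpha` near 0 it amplifies the diverse, high-frequency components (exploration), which is the explicit, interpretable control the abstract contrasts with manual parameter tuning.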
