AAAI 2026 Main Conference

January 22, 2026

Singapore, Singapore


This research proposes an extension to the Program Lattice Transformer (PLT), a neuro-symbolic framework for program induction that embeds programs into a structured latent space. The current PLT model uses a flat lattice and is computationally inefficient when modeling invariant programs, i.e., operations that return to their initial state after a fixed number of applications (e.g., a 360° rotation). To address this, we propose embedding the program space onto a cylindrical manifold rather than a plane. This choice is grounded in the principle that only isometric transformations preserve the lattice's compositional structure, which restricts valid manifolds to developable surfaces such as cylinders. A cylindrical geometry naturally represents invariant programs as closed loops, improving efficiency. The proposed method will be evaluated on synthetic tasks such as the Rubik's Cube and on the Abstraction and Reasoning Corpus (ARC) to demonstrate gains in performance and efficiency. This work is a step toward models that can autonomously configure their own geometric latent spaces, connecting to future research in geometric deep learning and meta-learning.
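To illustrate the core idea, the following sketch shows how an invariant program traces a closed loop on a cylindrical latent space. All names and the embedding scheme here are hypothetical illustrations, not the paper's actual implementation: an order-n operation is mapped to an angular step of 2π/n on the cylinder's periodic coordinate, while non-invariant structure would live on the unbounded axis.

```python
import math

TWO_PI = 2 * math.pi

def embed_cyclic_op(k, n, z=0.0):
    """Embed the k-th power of an order-n operation as a point
    (theta, z) on a cylinder: theta is the periodic (angular)
    coordinate, z the non-periodic axial coordinate."""
    theta = (TWO_PI * k / n) % TWO_PI
    return (theta, z)

def compose(p, q):
    """Composition of embedded operations: angles add modulo 2*pi
    (the closed direction), axial coordinates add freely."""
    return ((p[0] + q[0]) % TWO_PI, p[1] + q[1])

# A 90-degree rotation has order 4: four applications close the loop,
# returning (up to floating-point error) to the starting point.
quarter_turn = embed_cyclic_op(1, 4)
point = (0.0, 0.0)
for _ in range(4):
    point = compose(point, quarter_turn)
```

On a flat lattice the four quarter-turn steps would land at four distinct points, and the model would have to learn separately that the fourth coincides with the origin; on the cylinder the identity is built into the geometry, which is the efficiency argument the abstract makes.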

