
AAAI 2026

January 25, 2026

Singapore, Singapore


The learning dynamics of modern neural networks remain an open problem in deep learning. The Neural Tangent Kernel (NTK) offers an elegant description of training dynamics in the infinite-width limit, yet its classical formulation assumes a static dataset. Modern training practice departs from this strong assumption through on-the-fly data augmentations (e.g., additive noise). In this work, we conduct an NTK-driven analysis of how data transformations affect a neural network's evolution in function space. Our theoretical contributions characterize how repeated Gaussian perturbations drawn from NTK-derived covariances can steer neural-network optimization toward user-specified behavior. We validate these theoretical insights empirically through controlled experiments. Taken together, our results lay the foundation for a promising research direction that turns the NTK from a descriptive into a prescriptive tool, enabling grounded interventions that control training trajectories and generalization behavior at inference.
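The abstract's central object, Gaussian noise whose covariance is derived from the NTK, can be illustrated with a minimal sketch. The code below is an assumption-laden illustration, not the paper's method: it computes the empirical NTK of a tiny one-hidden-layer network as the Gram matrix of per-example parameter gradients, then draws zero-mean Gaussian perturbations with covariance proportional to that kernel (the scale `sigma` and the architecture are our choices for illustration).

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny one-hidden-layer scalar network: f(x) = v . tanh(W x)
d, h, n = 3, 8, 5                          # input dim, hidden width, num examples
W = rng.normal(size=(h, d)) / np.sqrt(d)
v = rng.normal(size=h) / np.sqrt(h)
X = rng.normal(size=(n, d))

def param_grad(x):
    """Gradient of f(x) with respect to all parameters, flattened."""
    z = W @ x                              # pre-activations
    a = np.tanh(z)
    dv = a                                 # d f / d v
    dW = np.outer(v * (1.0 - a**2), x)     # d f / d W
    return np.concatenate([dv, dW.ravel()])

# Empirical NTK on the n examples: K[i, j] = <grad f(x_i), grad f(x_j)>
J = np.stack([param_grad(x) for x in X])   # shape (n, num_params)
K = J @ J.T                                # symmetric, positive semi-definite

# Draw additive function-space noise with NTK-derived covariance.
# sigma is an illustrative scale, not a value from the paper.
sigma = 0.1
eps = rng.multivariate_normal(np.zeros(n), sigma**2 * K, method="svd")
print(K.shape, eps.shape)                  # (5, 5) (5,)
```

The `method="svd"` option of NumPy's `Generator.multivariate_normal` handles the (possibly singular) positive semi-definite kernel without requiring a strict Cholesky factorization.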
