Tensor network structure search (TN-SS) aims to automatically discover optimal network topologies and rank configurations for efficient tensor decomposition in high-dimensional data representation. Despite recent advances, existing TN-SS methods face significant limitations in computational tractability, structure adaptivity, and optimization robustness across diverse tensor characteristics. Current approaches struggle with three fundamental challenges: single-scale optimization that misses multi-scale structures, discrete search spaces that prevent smooth structure evolution, and separation of structure and parameter optimization that creates computational inefficiency. We propose RGTN (\textbf{R}enormalization \textbf{G}roup guided \textbf{T}ensor \textbf{N}etwork search), a novel physics-inspired framework that fundamentally transforms tensor network structure search through multi-scale renormalization group flows. Unlike existing methods that search through discrete structure spaces at fixed scales, RGTN implements a dynamic scale-transformation strategy where network structures evolve continuously across resolution levels. The key innovation lies in introducing learnable edge gates that enable topology modification during optimization, combined with intelligent structure proposals based on physical quantities—node tension measuring local stress and edge information flow quantifying connectivity importance. By starting optimization at coarse scales with exponentially reduced complexity and progressively refining toward finer scales, RGTN discovers more compact structures while naturally escaping local minima through scale-induced perturbations. Our code is available in the supplementary materials for reproducibility.
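The learnable edge gates described above can be illustrated with a minimal sketch. The snippet below is an assumption-laden toy, not RGTN's actual implementation: it gates each bond index of a two-node tensor network with a sigmoid of a learnable logit, so driving a logit toward negative infinity effectively prunes that index (and, if all indices close, the edge itself). The "information flow" score is a hypothetical proxy — the change in the contracted tensor when one bond index is switched off.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)

# Two tensor-network nodes sharing a bond of dimension r.
m, r, n = 4, 3, 5
A = rng.normal(size=(m, r))
B = rng.normal(size=(r, n))

# Learnable gate logits, one per bond index (hypothetical values);
# the sigmoid keeps each gate in (0, 1).
theta = np.array([4.0, 4.0, -4.0])   # third bond index nearly closed
g = sigmoid(theta)

# Gated contraction: the gate rescales each bond index, which lets the
# optimizer modify topology smoothly instead of searching a discrete space.
T = (A * g) @ B                      # == A @ np.diag(g) @ B

# Assumed "edge information flow" proxy: how much the contraction
# changes when a single bond index is switched off entirely.
flow = np.array([
    np.linalg.norm(T - (A * np.where(np.arange(r) == k, 0.0, g)) @ B)
    for k in range(r)
])
print(flow)  # the nearly-closed index contributes almost no flow
```

Because the gates are differentiable, structure (which edges survive) and parameters (the node tensors) can be optimized jointly, which is the separation the abstract argues existing methods fail to avoid.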