AAAI 2026

January 23, 2026

Singapore, Singapore


Most state-of-the-art time series imputation methods can leverage textual information to improve imputation quality, but they often struggle to filter noise from textual information derived from large language models (LLMs). Existing solutions filter over the entire token set, which can introduce erroneous conditional constraints, extreme token-frequency effects, and added computational cost. Motivated by this, we propose CaT-Diff, a novel cascaded text-enhanced diffusion model for probabilistic imputation of multivariate time series under Missing Not At Random (MNAR) scenarios. To suppress irrelevant semantics and focus on the context most predictive of missing values, CaT-Diff introduces a Hierarchical Semantic Filter (HSF) that collaborates with a Mixture-of-Experts (MoE) network: the MoE projects heterogeneous text embeddings into the time series latent space, and the HSF cascade-filters text embeddings from the segment level down to the token level, avoiding the pitfalls of direct token-level filtering while reducing overhead. We also incorporate a lightweight Missing Mechanism Estimator, jointly optimized with the denoising network to explicitly capture MNAR missingness patterns. Extensive experiments on nine domains show that CaT-Diff outperforms state-of-the-art baselines, cutting MSE by 14.7% and MAE by 7.6% relative to the next-best baselines. Our work establishes a new paradigm for selectively fusing LLM-derived textual information. Anonymous code is provided in the supplementary materials.
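The segment-to-token cascade described in the abstract can be sketched in a few dozen lines: first score whole text segments against the current time-series state and keep only the top-k, then apply token-level soft attention inside the surviving segments, and finally project the pooled text through a small mixture of experts into the series latent space. This is a minimal illustrative sketch, not the paper's implementation; all module names, shapes, and the choice of bilinear relevance scorers are our assumptions.

```python
import torch
import torch.nn as nn


class HierarchicalSemanticFilter(nn.Module):
    """Illustrative sketch of segment-to-token cascade filtering plus an
    MoE projection into the time-series latent space. All names and
    design choices here are hypothetical, not the paper's code."""

    def __init__(self, d_text, d_series, n_experts=4, keep_segments=2):
        super().__init__()
        self.keep_segments = keep_segments
        # Bilinear relevance scorers: query = series state, key = text.
        self.seg_score = nn.Bilinear(d_series, d_text, 1)
        self.tok_score = nn.Bilinear(d_series, d_text, 1)
        # MoE: each expert projects text into the series latent space.
        self.experts = nn.ModuleList(
            nn.Linear(d_text, d_series) for _ in range(n_experts))
        self.gate = nn.Linear(d_text, n_experts)

    def forward(self, series_state, text):
        # series_state: (B, d_series); text: (B, S segments, T tokens, d_text)
        B, S, T, D = text.shape
        # 1) Segment level: score each segment by its mean embedding,
        #    keep only the top-k segments (hard filter).
        seg_emb = text.mean(dim=2)                            # (B, S, d_text)
        q = series_state.unsqueeze(1).expand(B, S, -1)
        seg_logits = self.seg_score(q, seg_emb).squeeze(-1)   # (B, S)
        top = seg_logits.topk(self.keep_segments, dim=1).indices
        kept = torch.gather(
            text, 1, top[:, :, None, None].expand(-1, -1, T, D))
        # 2) Token level: soft attention only inside kept segments, so
        #    noisy tokens from irrelevant segments never compete.
        k = kept.flatten(1, 2)                                # (B, k*T, d_text)
        q2 = series_state.unsqueeze(1).expand(-1, k.size(1), -1)
        w = torch.softmax(self.tok_score(q2, k).squeeze(-1), dim=1)
        pooled = (w.unsqueeze(-1) * k).sum(dim=1)             # (B, d_text)
        # 3) MoE projection into the time-series latent space.
        g = torch.softmax(self.gate(pooled), dim=-1)          # (B, n_experts)
        outs = torch.stack([e(pooled) for e in self.experts], dim=1)
        return (g.unsqueeze(-1) * outs).sum(dim=1)            # (B, d_series)
```

Filtering at the segment level first keeps the token-level softmax small (k·T instead of S·T candidates), which is one plausible reading of the abstract's claim that the cascade reduces overhead versus direct token-level filtering.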
