AAAI 2026

January 25, 2026

Singapore, Singapore


The quadratic complexity of Multimodal Large Language Models (MLLMs) with respect to context length poses significant computational and memory challenges, hindering their real-world deployment. In this paper, we devise a "filter-correlate-compress" framework that accelerates MLLMs by systematically optimizing multimodal context length during prefilling. The framework first implements FiCoCo-V, a training-free method operating within the vision encoder. It employs a redundancy-based token discard mechanism that uses a novel integrated metric to accurately filter out redundant visual tokens. To mitigate information loss, the framework introduces a correlation-based information recycling mechanism that allows preserved tokens to selectively recycle information from correlated discarded tokens via a self-preserving compression, preventing the dilution of their own core content. The framework's FiCoCo-L variant further leverages task-aware textual priors to perform token reduction directly within the LLM decoder. Extensive experiments demonstrate that the FiCoCo series effectively accelerates a range of MLLMs, achieving up to 14.7× FLOPs reduction while retaining 93.6% of performance. Our methods consistently outperform state-of-the-art training-free approaches, showcasing effectiveness and generalizability across model architectures, sizes, and tasks without requiring retraining. Code is available in the supplementary materials.
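The filter-correlate-compress pipeline described above can be illustrated with a toy sketch. This is not the paper's implementation: the redundancy metric (mean cosine similarity to other tokens), the `keep_ratio` and `alpha` parameters, and the function name are all hypothetical stand-ins for the paper's integrated metric and self-preserving compression.

```python
import numpy as np

def filter_correlate_compress(tokens, keep_ratio=0.5, alpha=0.8):
    """Toy sketch of a filter-correlate-compress token-reduction pass.

    tokens: (N, D) array of visual token embeddings.
    keep_ratio: fraction of tokens to preserve (hypothetical parameter).
    alpha: self-preserving weight keeping each preserved token's own
           content dominant during compression (hypothetical parameter).
    """
    n_tokens = tokens.shape[0]
    # Cosine similarity between all token pairs.
    unit = tokens / np.linalg.norm(tokens, axis=1, keepdims=True)
    sim = unit @ unit.T
    np.fill_diagonal(sim, 0.0)

    # Filter: score redundancy as mean similarity to all other tokens;
    # the most redundant tokens are discarded.
    redundancy = sim.mean(axis=1)
    n_keep = max(1, int(n_tokens * keep_ratio))
    order = np.argsort(redundancy)
    keep_idx, drop_idx = order[:n_keep], order[n_keep:]

    # Correlate: match each discarded token to its most similar kept token.
    corr = sim[np.ix_(drop_idx, keep_idx)]
    targets = corr.argmax(axis=1)  # index into keep_idx

    # Compress: recycle discarded content into its correlated kept token,
    # with a self-preserving weight so the kept token's content dominates.
    out = tokens[keep_idx].copy()
    for d, t in zip(drop_idx, targets):
        out[t] = alpha * out[t] + (1.0 - alpha) * tokens[d]
    return out, keep_idx
```

A `keep_ratio` of 0.5 halves the visual context length fed to the LLM decoder, which is where the prefilling FLOPs savings would come from.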

Downloads

Paper
