AAAI 2026

January 22, 2026

Singapore, Singapore


Knowledge editing (KE) has emerged as an effective approach for updating factual information in large language models (LLMs) without full retraining. Most existing methods for addressing the "ripple effect" in KE adopt a chain-structured reasoning process, making them vulnerable to error accumulation from early incorrect steps. Moreover, their conflict detection mechanisms are often susceptible to the LLM's inherent confirmation bias, further undermining the reliability of the editing process. To overcome these challenges, we propose Tree of Editing (ToE), a tree-structured, retrieval-enhanced knowledge editing framework designed to support robust reasoning under factual updates. ToE expands reasoning paths using a breadth-first strategy combined with score-guided beam search, enabling diverse and error-tolerant inference. In addition, we introduce an observer that updates knowledge objectively, avoiding the bias caused by the LLM's overconfidence. Experimental results on two benchmarks, MQuAKE-CF (targeting ripple-aware editing) and DUNE (free-form editing), demonstrate that the ToE framework significantly outperforms existing methods.
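The abstract describes the search procedure only at a high level, so the sketch below is a minimal, hypothetical Python illustration of breadth-first expansion combined with score-guided beam search over a tree of reasoning paths. The `Node` type, the `expand` function, and the scoring are placeholder assumptions: in ToE the expansion would presumably prompt the LLM and retrieve relevant edited facts, and the observer-based conflict checking is not shown here.

```python
from dataclasses import dataclass

@dataclass
class Node:
    """A partial reasoning path: the steps taken so far and its cumulative score."""
    steps: tuple
    score: float

def expand(node: Node) -> list:
    """Placeholder expansion. In ToE this step would prompt the LLM, consult
    the retrieved edit memory, and propose candidate next reasoning steps."""
    last = node.steps[-1] if node.steps else "q"
    candidates = [f"{last}->s{i}" for i in range(3)]
    # Placeholder scoring: a real scorer would rate each candidate step's
    # consistency with the retrieved edits (the observer's role in the paper).
    return [Node(node.steps + (c,), node.score + 1.0 / (i + 1))
            for i, c in enumerate(candidates)]

def beam_search(beam_width: int = 2, max_depth: int = 3) -> Node:
    """Breadth-first expansion with score-guided pruning: at each depth every
    surviving path is expanded, then only the top-k paths are kept, so a
    single early mistake cannot doom the whole reasoning chain."""
    beam = [Node(steps=(), score=0.0)]
    for _ in range(max_depth):
        frontier = [child for node in beam for child in expand(node)]
        frontier.sort(key=lambda n: n.score, reverse=True)
        beam = frontier[:beam_width]
    return beam[0]

if __name__ == "__main__":
    best = beam_search()
    print(best.steps, round(best.score, 3))
```

Keeping `beam_width > 1` is what distinguishes this from the chain-structured baselines criticized in the abstract: several partial paths survive each pruning round, so a competing branch can displace an early low-quality step.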


