
AAAI 2026

January 23, 2026

Singapore


Theory of Mind (ToM) refers to the ability to reason about others' mental states, such as beliefs, desires, and intentions. Equipping large language model (LLM)-driven agents with ToM has been shown to improve their coordination in multi-agent collaborative tasks. However, we find that mismatches in ToM reasoning depth between agents—what we call misaligned ToM orders—can lead to insufficient or excessive reasoning about others, thereby impairing their coordination. To address this issue, we design an adaptive ToM (A-ToM) agent, which aligns its ToM order with its partner's. Based on prior interactions, the agent estimates the partner's likely ToM order and leverages this estimate to predict the partner's action, thereby facilitating behavioral coordination. We conduct empirical evaluations on four multi-agent coordination tasks: a repeated matrix game, two grid-navigation tasks, and an Overcooked task. The results validate our findings on ToM alignment and demonstrate the effectiveness of our A-ToM agent. Furthermore, we investigate the broader applicability of both our findings and the A-ToM agent.
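The adaptive mechanism described above—estimating a partner's ToM order from prior interactions and then best-responding to the predicted action—can be sketched as follows. This is a minimal, hypothetical illustration on a toy anti-coordination matrix game; the level-k partner model, the payoff matrix, and the multiplicative belief update are all illustrative assumptions, not the paper's actual implementation.

```python
# Anti-coordination game: a player scores 1 when the two actions differ.
def payoff(a, b):
    return 1.0 if a != b else 0.0

def best_response(partner_action):
    """Best action given a point prediction of the partner's action."""
    return max((0, 1), key=lambda a: payoff(a, partner_action))

def level_k_action(k):
    """Action of a level-k reasoner: level 0 plays a fixed default
    action, and level k best-responds to a level-(k-1) partner."""
    action = 0
    for _ in range(k):
        action = best_response(action)
    return action

class AToMAgent:
    """Maintains a belief over the partner's ToM order and best-responds
    to the predicted partner action, i.e. reasons one order higher than
    its current estimate of the partner."""

    def __init__(self, max_order=2, stickiness=0.9):
        self.weights = [1.0] * (max_order + 1)  # belief over orders 0..max
        self.stickiness = stickiness            # assumed P(match | correct order)

    def observe(self, partner_action):
        # Multiplicative update: orders whose predicted action matches
        # the observed one gain weight, the others lose it.
        for k in range(len(self.weights)):
            match = level_k_action(k) == partner_action
            self.weights[k] *= self.stickiness if match else 1 - self.stickiness
        total = sum(self.weights)
        self.weights = [w / total for w in self.weights]

    def act(self):
        # Predict the partner via the most likely order, then best-respond.
        k_hat = max(range(len(self.weights)), key=lambda k: self.weights[k])
        return best_response(level_k_action(k_hat))

agent = AToMAgent()
partner_order = 1                      # partner is a fixed level-1 reasoner
for _ in range(5):
    agent.observe(level_k_action(partner_order))

print(agent.act(), level_k_action(partner_order))  # prints "0 1": coordinated
```

After a few observations the belief concentrates on the partner's true order, and the agent's action complements the partner's—mirroring the idea that aligning ToM orders avoids both under- and over-reasoning about the other agent.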


