EMNLP 2025

November 06, 2025

Suzhou, China


Retrieving entity knowledge that aligns with user intent is essential for task-oriented dialogue (TOD) systems to support personalization and localization, especially with large-scale knowledge bases. However, generative models tend to suffer from implicit association preference, while retrieval-generation approaches face knowledge transfer discrepancies. To address these challenges, we propose CaTER, a Context-aware Topology Entity Retrieval Contrastive Learning Framework. CaTER introduces a cycle context-aware distilling attention mechanism, which employs context-independent sparse pooling to suppress noise from weakly relevant attributes. We further construct topologically hard negative samples by decoupling entity information from generated responses and design a topology entity retrieval contrastive loss that trains the retriever through reverse distillation. Extensive experiments on three standard TOD benchmarks, covering both small and large-scale knowledge bases, show that CaTER consistently outperforms strong baselines such as MAKER and MK-TOD, achieving state-of-the-art performance for TOD systems.
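The abstract does not spell out the form of the retrieval contrastive loss, so the sketch below is only a generic illustration of training a retriever with an InfoNCE-style objective over one gold entity and several hard negative entities per dialogue context. The function name, tensor shapes, and temperature value are illustrative assumptions, not CaTER's actual formulation, which additionally involves reverse distillation and topology-based negatives mined from generated responses.

    import torch
    import torch.nn.functional as F

    def contrastive_retrieval_loss(query_emb, pos_emb, hard_neg_emb, temperature=0.05):
        # query_emb:    (B, D)    dialogue-context embeddings from the retriever
        # pos_emb:      (B, D)    embeddings of the gold (positive) entities
        # hard_neg_emb: (B, K, D) embeddings of K hard negative entities per query
        # Shapes and the temperature are illustrative assumptions.
        q = F.normalize(query_emb, dim=-1)
        p = F.normalize(pos_emb, dim=-1)
        n = F.normalize(hard_neg_emb, dim=-1)

        pos_sim = (q * p).sum(-1, keepdim=True)        # (B, 1) similarity to the gold entity
        neg_sim = torch.einsum("bd,bkd->bk", q, n)     # (B, K) similarities to hard negatives

        # Positive logit sits at index 0; cross-entropy pushes it above the negatives.
        logits = torch.cat([pos_sim, neg_sim], dim=-1) / temperature
        labels = torch.zeros(q.size(0), dtype=torch.long, device=q.device)
        return F.cross_entropy(logits, labels)

In this standard setup, harder negatives (e.g., entities that share topology or attributes with the gold entity) give a stronger training signal than random in-batch negatives, which is the general motivation for constructing topologically hard negative samples.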
