EMNLP 2025

November 07, 2025

Suzhou, China


Procedural text adaptation—such as modifying recipes or revising instructional guides—has traditionally relied on specialized models extensively fine‑tuned for specific domains. To address the scalability limitations of such approaches, recent research has increasingly turned to general‑purpose large language models (LLMs). However, existing prompting strategies for LLMs often yield superficial or erroneous adaptations due to alignment‑induced biases and the inherent complexity of procedural editing. To overcome these challenges, we propose the Over‑generation‑and‑Compaction (OC) prompting strategy, which first elicits an exhaustive set of procedural details to leverage the model’s latent knowledge, and subsequently compacts them into concise, coherent adaptations. We further introduce Recipe Consistency & Feasibility (RCF), a novel metric for systematically assessing procedural validity and practicality in cooking recipe adaptations. Experiments on public datasets demonstrate that OC significantly improves adaptation consistency and feasibility compared to baseline prompting methods, without the need for additional fine-tuning or curated training resources.
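The two-stage OC strategy described above can be sketched as a pair of chained prompts: a first call that over-generates candidate procedural edits, and a second call that compacts them into a coherent adaptation. The prompt wordings and the `call_llm` stub below are illustrative assumptions, not the paper's actual prompts; in practice `call_llm` would wrap any chat-completion API.

```python
# Minimal sketch of the Over-generation-and-Compaction (OC) prompting
# strategy. `call_llm` is a deterministic stub standing in for a real
# LLM call so the two-stage flow can run end to end; the prompts are
# hypothetical, not the ones used in the paper.

def call_llm(prompt: str) -> str:
    """Stub for an LLM chat-completion call."""
    if prompt.startswith("List every"):
        # Stage-1 answer: exhaustive, possibly redundant details.
        return ("- replace butter with coconut oil\n"
                "- reduce oven temperature slightly\n"
                "- note: butter adds richness (redundant)\n")
    # Stage-2 answer: compacted adaptation.
    return ("1. Cream coconut oil and sugar.\n"
            "2. Bake at a slightly reduced temperature.")

def overgenerate(recipe: str, constraint: str) -> str:
    """Stage 1: elicit an exhaustive set of procedural details."""
    prompt = ("List every ingredient substitution, tool change, and step "
              f"adjustment needed to make this recipe {constraint}:\n"
              f"{recipe}")
    return call_llm(prompt)

def compact(recipe: str, details: str) -> str:
    """Stage 2: compact the details into a concise, coherent adaptation."""
    prompt = (f"Given these candidate edits:\n{details}\n"
              "Rewrite the recipe below as a short, coherent adapted "
              f"procedure, dropping redundant or infeasible edits:\n{recipe}")
    return call_llm(prompt)

def oc_adapt(recipe: str, constraint: str) -> str:
    """Chain both stages: over-generate, then compact."""
    return compact(recipe, overgenerate(recipe, constraint))

adapted = oc_adapt("Cream butter and sugar; bake at 180 C.", "dairy-free")
print(adapted)
```

Separating elicitation from compaction is the key design choice: the first call taps the model's latent procedural knowledge without length pressure, while the second call enforces conciseness and coherence.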

