EMNLP 2025

November 07, 2025

Suzhou, China


In-context learning is an emergent learning paradigm that enables an LLM to learn an unseen task from a number of demonstrations presented in its context window. The quality of the demonstrations is of paramount importance because 1) context window size limitations restrict the number of demonstrations that can be presented to the model, and 2) the model must identify the task and potentially learn new, unseen input-output mappings from the limited demonstration set. An increasing body of work has also shown that predictions are sensitive to perturbations of the demonstration set. Given this importance, this work presents a survey of the current literature on the relationship between data and in-context learning. We present our survey in three parts: the "good" -- qualities that are desirable when selecting demonstrations; the "bad" -- qualities of demonstrations that can negatively impact the model, as well as issues that can arise in presenting demonstrations; and the "debatable" -- qualities of demonstrations with mixed results, or factors that modulate data impacts.
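To make the setup concrete, here is a minimal sketch (not from the paper) of how a few-shot in-context learning prompt is typically assembled: a small set of (input, output) demonstration pairs is formatted into the context window, followed by the query input whose output the model must predict. The task and demonstrations below are hypothetical examples.

```python
def build_icl_prompt(demonstrations, query):
    """Format (input, output) demonstration pairs, then the unanswered query.

    The model is expected to infer the task from the demonstrations and
    complete the final "Output:" line.
    """
    blocks = [f"Input: {x}\nOutput: {y}" for x, y in demonstrations]
    blocks.append(f"Input: {query}\nOutput:")
    return "\n\n".join(blocks)

# Hypothetical sentiment-classification demonstrations.
demos = [
    ("The movie was fantastic.", "positive"),
    ("I would not recommend this product.", "negative"),
]
prompt = build_icl_prompt(demos, "An absolute delight from start to finish.")
print(prompt)
```

Because the demonstration set is small, choices such as which examples to include and in what order can noticeably change the model's prediction, which is the sensitivity this survey examines.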
