EMNLP 2025

November 06, 2025

Suzhou, China


This paper introduces an algorithm for selecting demonstration examples for in-context learning on a query set. Given a set of n examples, how can we quickly select k out of n to best serve as the conditioning for a downstream task? This problem has broad applications, including prompt tuning and chain-of-thought reasoning. Since model weights remain fixed during in-context learning, previous work has designed methods based on similarity scores measured in the input embeddings. This work proposes a new approach based on gradients of the model output taken in the input embedding space. Our approach estimates model outputs through a first-order approximation using these gradients. We then apply this estimation to multiple randomly sampled subsets. Finally, we aggregate the sampled subset outcomes into an influence score for each demonstration and select the k most relevant examples. This procedure requires pre-computing model outputs and gradients only once, yielding an algorithm that runs in linear time relative to model and training set sizes. Extensive experiments across various LLMs and datasets validate the efficiency of our approach. We show that the gradient estimation procedure approximates full inference with less than 1% error across six datasets. This allows us to scale up subset selection methods that would otherwise run full inference by up to 37.7× on LLMs with up to 34 billion parameters, and to outperform existing selection methods based on input embeddings by 11% on average.
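The abstract's pipeline (precompute gradients once, linearly estimate outputs over random subsets, aggregate into per-example influence scores, pick the top k) can be sketched as follows. This is a hypothetical simplification, not the paper's implementation: the function name, the scalar `base_output`, and the specific first-order estimate are all illustrative assumptions.

```python
import numpy as np

def select_demonstrations(embeddings, base_output, gradients, k,
                          n_subsets=200, subset_size=8, seed=None):
    """Hypothetical sketch of gradient-based demonstration selection.

    embeddings:  (n, d) input embeddings of the n candidate demonstrations
    base_output: scalar reference model output (e.g. query log-likelihood),
                 computed once with full inference
    gradients:   (n, d) gradient of the output w.r.t. each demonstration's
                 embedding, also precomputed once
    Returns the indices of the k demonstrations with the highest
    aggregated influence scores.
    """
    rng = np.random.default_rng(seed)
    n = embeddings.shape[0]
    influence = np.zeros(n)
    counts = np.zeros(n)
    for _ in range(n_subsets):
        subset = rng.choice(n, size=subset_size, replace=False)
        # First-order estimate of the model output when conditioning on
        # this subset: base output plus each member's linearized contribution.
        est = base_output + sum(gradients[i] @ embeddings[i] for i in subset)
        # Credit the estimated outcome to every example in the subset.
        influence[subset] += est
        counts[subset] += 1
    # Average the estimated outcomes over all subsets containing each example.
    scores = influence / np.maximum(counts, 1)
    return np.argsort(-scores)[:k]
```

Because the expensive quantities (`base_output` and `gradients`) are computed once up front, each sampled subset costs only a handful of dot products, which is what makes the procedure linear in model and training set sizes rather than requiring full inference per subset.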

