Zhaofeng Wu

MIT

TOPICS

efficiency, document, machine translation, transformers, zero-shot, semantics, meta-learning, few-shot, probing, prompting, finetuning, semantic dependencies, pretrained transformer, efficient methods, multi-task learning

6 presentations, 12 views

SHORT BIO

Zhaofeng Wu is a PhD student at MIT CSAIL working with Professor Yoon Kim. He was previously a predoctoral researcher at AI2 (the Allen Institute for AI). He obtained his M.S., B.S., and B.A. at the University of Washington, where he worked with the UW NLP group.

Presentations

Continued Pretraining for Better Zero- and Few-Shot Promptability
Zhaofeng Wu and 6 other authors

Modeling Context With Linear Attention for Scalable Document-Level Translation
Zhaofeng Wu

ABC: Attention with Bounded-memory Control
Hao Peng and 7 other authors

Understanding Mention Detector-Linker Interaction in Neural Coreference Resolution
Zhaofeng Wu and 1 other author

Infusing Finetuning with Semantic Dependencies
Zhaofeng Wu and 2 other authors

Modeling Context With Linear Attention for Scalable Document-Level Translation
Zhaofeng Wu and 3 other authors
