
Tu Vu

Doctoral student @ University of Massachusetts Amherst

Topics: prompt tuning, question answering, large language models, soft prompt transfer, parameter-efficient methods, DeBERTa, parameter-efficient tuning, generative data augmentation, Vietnamese pre-trained language models, data augmentation, zero-shot cross-lingual generation

4 presentations · 8 views

SHORT BIO

I am a Ph.D. candidate in the College of Information and Computer Sciences (CICS) at the University of Massachusetts Amherst (UMass Amherst), where I work with Professor Mohit Iyyer in the UMass Natural Language Processing group (UMass NLP). I also currently spend one day a week as a student researcher at Google Brain.

Presentations

ViDeBERTa: A powerful pre-trained language model for Vietnamese

Cong Dao Tran and 4 other authors

Leveraging QA Datasets to Improve Generative Data Augmentation

Dheeraj Mekala and 3 other authors

SPoT: Better Frozen Model Adaptation through Soft Prompt Transfer

Daniel Cer and 4 other authors

STraTA: Self-Training with Task Augmentation for Better Few-shot Learning

Tu Vu and 4 other authors

Underline Science, Inc.
1216 Broadway, 2nd Floor, New York, NY 10001, USA

© 2025 Underline - All rights reserved