
Tu Vu
Doctoral student @ University of Massachusetts Amherst
prompt tuning
question answering
large language models
soft prompt transfer
parameter-efficient methods
DeBERTa
parameter-efficient tuning
generative data augmentation
Vietnamese pre-trained language model
data augmentation
zero-shot cross-lingual generation
SHORT BIO
I am a Ph.D. candidate in the College of Information and Computer Sciences (CICS) at the University of Massachusetts Amherst (UMass Amherst), where I work with Professor Mohit Iyyer in the UMass Natural Language Processing group (UMass NLP). Currently, I also spend one day a week as a student researcher at Google Brain.
Presentations

ViDeBERTa: A powerful pre-trained language model for Vietnamese
Cong Dao Tran and 4 other authors

Leveraging QA Datasets to Improve Generative Data Augmentation
Dheeraj Mekala and 3 other authors

SPoT: Better Frozen Model Adaptation through Soft Prompt Transfer
Daniel Cer and 4 other authors

STraTA: Self-Training with Task Augmentation for Better Few-shot Learning
Tu Vu and 4 other authors