Dian Yu
Graduate student @ UC Davis
transfer learning
multilingual
language representation
6 presentations · 3 views · 1 citation
SHORT BIO
Dian Yu is a final-year PhD student at UC Davis and a research scientist at Google Brain. His thesis focuses on attribute representation with neural language models, presenting methods to control pre-trained models more efficiently for NLU and NLG tasks. His research centers on dialog, including open-domain and task-oriented dialog, and he is also interested in multilingual and multimodal research.
OTHER AFFILIATIONS
Google Brain
Presentations
Automatically Exposing Problems with Neural Dialog Models
Dian Yu and 1 other author
Attribute Alignment: Controlling Text Generation from Pre-trained Language Models
Dian Yu and 2 other authors
Language Embeddings for Typology and Cross-lingual Transfer Learning
Dian Yu and 2 other authors
Few-shot Intent Classification and Slot Filling with Retrieved Examples
Dian Yu and 5 other authors