
Chen Liang

Doctoral student @ Georgia Institute of Technology

Topics: multi-task learning, pruning, generalization, consistency regularization, ensemble learning, compression, lottery ticket hypothesis, pre-trained language model fine-tuning, perturbation, collaborative distillation, model generalization

3 presentations · 4 views · 1 citation

SHORT BIO

I am a second-year student in the Machine Learning Ph.D. program at the Georgia Institute of Technology (Georgia Tech). I am working with Prof. Tuo Zhao in the FLASH (Foundations of LeArning Systems for alcHemy) research group. I received my M.S. degree in Computational Science & Engineering from Georgia Tech, and my B.S. degree in Electrical Engineering from the University of Southern California (USC), where my undergraduate advisor was Prof. C.-C. Jay Kuo.

I am broadly interested in machine learning for natural language processing. My research focuses on developing methodologies and algorithms to improve the parameter efficiency and generalization of large-scale language models. My interests also include transfer learning and representation learning (e.g., multi-domain and multi-task learning).

Presentations

CAMERO: Consistency Regularized Ensemble of Perturbed Language Models with Weight Sharing

Chen Liang and 4 other authors

Token-wise Curriculum Learning for Neural Machine Translation

Chen Liang and 6 other authors

Super Tickets in Pre-Trained Language Models: From Model Compression to Improving Generalization

Chen Liang and 7 other authors
