Qihuang Zhong

Student @ Wuhan University

TOPICS

self-evolution learning, zero-shot, knowledge distillation, efficient training, quantization, minimax optimization, mixup, language model pretraining, adaptive training, few-shot text classification, autoregressive language model, sequence-to-sequence learning, transformer-based model pretraining, token dropping, token-specific label smoothing

6 presentations · 2 views

SHORT BIO

I am currently pursuing a Ph.D. degree in Artificial Intelligence at the School of Computer Science, Wuhan University. My research interests include language model pretraining, natural language understanding, and natural language generation. I have authored or co-authored several papers at top conferences and in international journals, including IEEE TKDE, ACL, EMNLP, and COLING. I have also ranked first on the General Language Understanding Evaluation (GLUE) benchmark and its more difficult successor, SuperGLUE.

Presentations

Revisiting Knowledge Distillation for Autoregressive Language Models

Qihuang Zhong and 5 other authors

Zero-shot Sharpness-Aware Quantization for Pre-trained Language Models

Miaoxi Zhu and 6 other authors

Self-Evolution Learning for Mixup: Enhance Data Augmentation on Few-Shot Text Classification Tasks

Haoqi Zheng and 7 other authors

Self-Evolution Learning for Discriminative Language Model Pretraining

Qihuang Zhong and 4 other authors

Token-Level Self-Evolution Training for Sequence-to-Sequence Learning

Keqin Peng and 6 other authors

Revisiting Token Dropping Strategy in Efficient BERT Pretraining

Qihuang Zhong and 6 other authors
