
Zixiang Ding
Researcher @ Meituan
Topics: model compression, text classification, learning & optimization for SNLP, dynamic knowledge distillation, multi-teacher distillation, one-to-one distillation
SHORT BIO
Zixiang Ding received his Ph.D. from the Institute of Automation, Chinese Academy of Sciences, in 2022. He currently works at Meituan in Beijing, China. His research interests include large language models, model compression, natural language processing, computer vision, neural architecture search, and deep reinforcement learning.
Presentations

How to Trade Off the Quantity and Capacity of Teacher Ensemble: Learning Categorical Distribution to Stochastically Employ a Teacher for Distillation
Zixiang Ding and 4 other authors

SKDBERT: Compressing BERT via Stochastic Knowledge Distillation
Zixiang Ding and 4 other authors