
Zixiang Ding

Researcher @ Meituan

Research topics: model compression, text classification, learning & optimization for SNLP, dynamic knowledge distillation, multi-teacher distillation, one-to-one distillation manner


SHORT BIO

Zixiang Ding received the Ph.D. degree from the Institute of Automation, Chinese Academy of Sciences, in 2022. He is currently working at Meituan, Beijing, China. His research interests include large language models, model compression, natural language processing, computer vision, neural architecture search, and deep reinforcement learning.

Presentations

How to Trade Off the Quantity and Capacity of Teacher Ensemble: Learning Categorical Distribution to Stochastically Employ a Teacher for Distillation

Zixiang Ding and 4 other authors

SKDBERT: Compressing BERT via Stochastic Knowledge Distillation

Zixiang Ding and 4 other authors
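Both presentations concern stochastic knowledge distillation: at each training step a single teacher is sampled from a categorical distribution over a teacher ensemble, and the student is distilled from that one teacher (a one-to-one distillation manner). The sketch below is only a generic illustration of that sampling idea, not the authors' SKDBERT implementation; the models, sampling probabilities, and temperature are placeholder assumptions.

```python
# Minimal sketch (assumed, not the authors' code): distill from one teacher
# per step, where the teacher is drawn from a categorical distribution.
import torch
import torch.nn.functional as F

def stochastic_distillation_step(student, teachers, probs, batch, temperature=4.0):
    """One distillation step with a stochastically sampled teacher.

    student: trainable model mapping a batch to logits
    teachers: list of frozen teacher models mapping a batch to logits
    probs: 1-D tensor of sampling probabilities over the teachers
           (e.g. a learned categorical distribution, here assumed given)
    batch: input tensor fed to both the student and the sampled teacher
    """
    # Sample a single teacher index from the categorical distribution.
    idx = torch.multinomial(probs, num_samples=1).item()
    with torch.no_grad():
        teacher_logits = teachers[idx](batch)

    student_logits = student(batch)

    # Standard soft-label KD loss against the sampled teacher only.
    loss = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * (temperature ** 2)
    return loss
```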
