
Aref Jafari
Researcher @ Huawei Noah's Ark Lab
knowledge distillation
natural language processing
efficient training
regularization
llms
capacity gap
continuation optimization
hinge loss
annealing knowledge distillation
retrieval-based llms
5 presentations
19 views
SHORT BIO
Aref Jafari is a Ph.D. student in the School of Computer Science at the University of Waterloo. He completed his bachelor's degree and first master's degree in computer science, and a second master's degree in computational mathematics. He is also a research intern on the NLP team at Huawei Technologies Canada Co., Ltd. His research interests include the theory of machine learning and deep learning, optimization, and natural language processing.
Presentations

Efficient Citer: Tuning Large Language Models for Enhanced Answer Quality and Verification
Marzieh Tahaei and 8 other authors

Do we need Label Regularization to Fine-tune Pre-trained Language Models?
Ivan Kobyzev and 7 other authors

Pro-KD: Progressive Distillation by Following the Footsteps of the Teacher
Aref Jafari

How to Select One Among All? An Empirical Study Towards the Robustness of Knowledge Distillation in Natural Language Understanding
Tianda Li and 5 other authors

Continuation KD: Improved Knowledge Distillation through the Lens of Continuation Optimization
Aref Jafari and 4 other authors