
Aref Jafari

Researcher, Huawei Noah's Ark Lab

  • knowledge distillation
  • natural language processing
  • efficient training
  • regularization
  • LLMs
  • capacity gap
  • continuation optimization
  • hinge loss
  • annealing knowledge distillation
  • retrieval-based LLMs

Presentations: 5
Views: 19

SHORT BIO

Aref Jafari is a Ph.D. student in the School of Computer Science at the University of Waterloo. He holds a bachelor's degree and a first master's degree in computer science, and a second master's degree in computational mathematics. He is also a research intern on the NLP team at Huawei Technologies Canada Co., Ltd. His research interests include the theory of machine learning and deep learning, optimization, and natural language processing.

Presentations

Efficient Citer: Tuning Large Language Models for Enhanced Answer Quality and Verification

Marzieh Tahaei and 8 other authors

Do we need Label Regularization to Fine-tune Pre-trained Language Models?

Ivan Kobyzev and 7 other authors

Pro-KD: Progressive Distillation by Following the Footsteps of the Teacher

Aref Jafari

How to Select One Among All? An Empirical Study Towards the Robustness of Knowledge Distillation in Natural Language Understanding

Tianda Li and 5 other authors

Continuation KD: Improved Knowledge Distillation through the Lens of Continuation Optimization

Aref Jafari and 4 other authors
