
Yuxin Jiang

PhD student @ HKUST

Topics: large language models; instruction following; alignment; ChatGPT; knowledge editing; sentence-level relations; multi-level fine-grained constraints; instruction-following benchmark; knowledge distillation; accessibility; image text matching; counterfactual/contrastive explanations; human-centered evaluation

5 presentations

SHORT BIO

I am a third-year Ph.D. candidate at the Hong Kong University of Science and Technology. Supervised by Prof. Wei Wang, I work on Natural Language Processing (NLP) and its applications. Previously, I received a B.S. in Mathematics and Applied Mathematics from Shanghai University (advised by Prof. Qingwen Wang) in 2020 and an M.S. in Big Data and Technology (advised by Prof. Fangzhen Lin) from the Hong Kong University of Science and Technology in 2021.

Presentations

MT-Eval: A Multi-Turn Capabilities Evaluation Benchmark for Large Language Models

Wai Chung Kwan and 8 other authors

FollowBench: A Multi-level Fine-grained Constraints Following Benchmark for Large Language Models

Yuxin Jiang and 9 other authors

Learning to Edit: Aligning LLMs with Knowledge Editing

Yuxin Jiang and 11 other authors

Exploring the Potential of ChatGPT on Sentence Level Relations: A Focus on Temporal, Causal, and Discourse Relations

Chunkit Chan and 6 other authors

Lion: Adversarial Distillation of Proprietary Large Language Models

Yuxin Jiang and 3 other authors
