
Jiaxin Bai

Ph.D. Student @ HKUST

Research topics: reinforcement learning, knowledge distillation, commonsense, reasoning, knowledge graph, benchmarking, large language models, logical reasoning, abductive reasoning, conceptualization, evaluation methodologies, instantiation, evidence conflicts, financial/business NLP, factual conflicts


SHORT BIO

I am a Ph.D. student in the CSE department of the Hong Kong University of Science and Technology, which I joined in the fall of 2020, supervised by Professor Yangqiu Song. My research interests currently lie in natural language processing and knowledge graph reasoning.

Presentations

MIND: Multimodal Shopping Intention Distillation from Large Vision-language Models for E-commerce Purchase Understanding

Baixuan Xu and 14 other authors

IntentionQA: A Benchmark for Evaluating Purchase Intention Comprehension Abilities of Language Models in E-commerce

Wenxuan Ding and 13 other authors

CANDLE: Iterative Conceptualization and Instantiation Distillation from Large Language Models for Commonsense Reasoning

Weiqi Wang and 11 other authors

Advancing Abductive Reasoning in Knowledge Graphs through Complex Logical Hypothesis Generation

Jiaxin Bai and 5 other authors

Query2Particles: Knowledge Graph Reasoning with Particle Embeddings

Jiaxin Bai and 3 other authors
