Donghan Yu

Topics: knowledge graph · language model · open-domain question answering · language model pre-training · dictionary · dictionary-enhanced language model

3 presentations · 12 views
SHORT BIO

Hi! I am Donghan Yu, a fourth-year Ph.D. student in the Language Technologies Institute at Carnegie Mellon University, where I am fortunate to be advised by Prof. Yiming Yang. Before that, I received my B.S. in Electronic Engineering from Tsinghua University, advised by Prof. Yong Li. My research interests include knowledge-enhanced NLP, knowledge graph modeling, and graph neural networks. My recent research covers (1) joint pre-training on knowledge graphs and text, (2) knowledge-graph-enhanced open-domain question answering, and (3) graph convolutional networks for knowledge graph modeling.

Presentations

Dict-BERT: Enhancing Language Model Pre-training with Dictionary

Wenhao Yu and 7 other authors

KG-FiD: Infusing Knowledge Graph in Fusion-in-Decoder for Open-Domain Question Answering

Donghan Yu and 8 other authors

JAKET: Joint Pre-Training of Knowledge Graph and Language Understanding

Donghan Yu and 3 other authors
