
Jeonghwan Kim

UIUC, Computer Science

Research topics: dataset, large language models, retrieval-augmented language models, concept simplification, domain-specific simplification

SHORT BIO

My primary research interests lie at the intersection of natural language processing (NLP) and machine learning. I aspire to build a general-purpose NLP system that transcends the boundaries of human-level linguistic capability and disseminates knowledge worldwide. I strongly believe that my research in NLP will contribute to the development of such a system, and I will orient my work toward this goal by tackling open challenges in the field.

More specifically, I am particularly interested in developing neuro-symbolic language models that can process symbols with certain characteristics (e.g., numbers) and understand those symbols in light of the given context. Lately, I have also been interested in the interaction between the parametric and non-parametric knowledge of pre-trained LMs.

Presentations

Finer: Investigating and Enhancing Fine-Grained Visual Concept Recognition in Large Vision Language Models

Jeonghwan Kim and 1 other author

Why So Gullible? Enhancing the Robustness of Retrieval-Augmented Models against Counterfactual Noise

Giwon Hong and 4 other authors

Exploiting Numerical-Contextual Knowledge to Improve Numerical Reasoning in Question Answering

Jeonghwan Kim and 4 other authors

