
Sangmin Bae

Ph.D. Student @ KAIST AI / Google DeepMind

RESEARCH TOPICS

  • continual learning
  • large language models
  • inference acceleration
  • evaluation benchmarks
  • cv: representation learning for vision
  • ml: representation learning
  • ml: unsupervised & self-supervised learning
  • early-exiting

3 presentations

SHORT BIO

I am a Research Scientist with a strong desire to become a versatile, T-shaped expert in AI. While I have primarily focused on Computer Vision, I have also explored other AI domains, including NLP, Audio, Tabular, and Video, to broaden my knowledge and expertise. My research interests lie in Efficient AI: exploring training- and data-efficient approaches to make AI more accessible and sustainable. My research areas include Self-Supervised Learning, Federated Learning, Generative AI, and Multimodal Learning.

Presentations

Carpe diem: On the Evaluation of World Knowledge in Lifelong Language Models

Yujin Kim and 6 other authors

Fast and Robust Early-Exiting Framework for Autoregressive Language Models with Synchronized Parallel Decoding

Sangmin Bae and 3 other authors

Self-Contrastive Learning: Single-viewed Supervised Contrastive Framework using Sub-network

Sangmin Bae and 5 other authors

