
Young Jin Kim
Principal Researcher @ Microsoft
machine translation
inference
optimization
style transfer
quantization
parameter-efficient fine-tuning
mixture of experts
offsite-tuning
2 presentations
1 view
1 citation
SHORT BIO
Young Jin Kim is a Principal Researcher at Microsoft, where he develops machine learning models with state-of-the-art techniques. His recent research focuses on designing efficient and effective algorithms and model architectures for large-scale language models. Young received his Ph.D. from the Georgia Institute of Technology for his research in deep learning and high-performance computing.
Presentations

PEMA: An Offsite-Tunable Plug-in External Memory Adaptation for Language Models
Hyun Jin Kim and 2 other authors

Who Says Elephants Can't Run: Bringing Large Scale MoE Models into Cloud Scale Production
Young Jin Kim