
Minsoo Kim

PhD Student @ Hanyang University

Topics: quantization, transformer encoder, knowledge distillation, CLIP, LLM, SNLP, PEFT, LoRA, language models, rank, video understanding, multi-modal learning


SHORT BIO

I am a fourth-year student in the MS/Ph.D. integrated program at the Artificial Intelligence Hardware & Algorithm Lab at Hanyang University, where I am fortunate to be advised by Professor Jungwook Choi. My research interests lie in Transformer-based model compression (quantization and knowledge distillation) and large language models (generative language models, PEFT, and visual-language models).

Presentations

RA-LoRA: Rank-Adaptive Parameter-Efficient Fine-Tuning for Accurate 2-bit Quantized Large Language Models

Minsoo Kim and 3 other authors

Teacher Intervention: Improving Convergence of Quantization Aware Training for Ultra-Low Precision Transformers

Minsoo Kim and 4 other authors

Understanding and Improving Knowledge Distillation for Quantization Aware Training of Large Transformer Encoders

Minsoo Kim
