
Hyung Won Chung
Topics: language models, pretraining, efficiency, scaling laws, emergent abilities
SHORT BIO
Hyung Won Chung is a software engineer at Google Brain. His research focuses on various aspects of language model pretraining, including giant language models and multilinguality.
Presentations

Transcending Scaling Laws with 0.1% Extra Compute
Yi Tay and 15 other authors

A Simple and Effective Positional Encoding for Transformers
Pu-Chin Chen and 5 other authors

Learning Compact Metrics for MT
Amy Pu and 4 other authors

Do Transformer Modifications Transfer Across Implementations and Applications?
Hyung Won Chung and 1 other author