VIDEO DOI: https://doi.org/10.48448/jtyp-3q56

findings / work in progress

EMNLP 2022

Abu Dhabi, United Arab Emirates

Scaling Laws Under the Microscope: Predicting Transformer Performance from Small Scale Experiments
