Technical paper

NAACL 2022

July 11, 2022

Seattle, United States

KroneckerBERT: Significant Compression of Pre-trained Language Models Through Kronecker Decomposition and Knowledge Distillation
