VIDEO DOI: https://doi.org/10.48448/jhas-7374

technical paper

EMNLP 2021

November 08, 2021

Live on Underline

How to Select One Among All? An Empirical Study Towards the Robustness of Knowledge Distillation in Natural Language Understanding
