VIDEO DOI: https://doi.org/10.48448/c7g3-yp95

workshop paper

EMNLP 2021

November 08, 2021

Live on Underline

Is BERT a Cross-Disciplinary Knowledge Learner? A Surprising Finding of Pre-trained Models’ Transferability
