Olga Kovaleva

University of Massachusetts Lowell

Topics: transformers, pruning, layernorm


SHORT BIO

Olga Kovaleva is a fourth-year PhD student at the University of Massachusetts Lowell. Her research focuses on interpretability studies of Transformer models in NLP, with a particular emphasis on BERT. She has also worked on applications of NLP to the biomedical domain and gained experience with computer vision techniques while interning at IBM Research. After graduation, Olga intends to join Facebook as a machine learning engineer.

Presentations

BERT Busters: Outlier Dimensions that Disrupt Transformers

Olga Kovaleva and 3 other authors
