
Olga Kovaleva
University of Massachusetts Lowell
Topics: transformers, pruning, layernorm
SHORT BIO
Olga Kovaleva is a 4th-year PhD student at the University of Massachusetts Lowell. Her research primarily focuses on interpretability studies of Transformer models in NLP, with a particular emphasis on BERT. In the past, Olga has also worked on applications of NLP to the biomedical domain and gained expertise in computer vision techniques while interning at IBM Research. After graduation, Olga intends to join Facebook as a Machine Learning engineer.
Presentations

BERT Busters: Outlier Dimensions that Disrupt Transformers
Olga Kovaleva and 3 other authors