
Michael Hassid
Graduate student @ Hebrew University of Jerusalem
Research topics: transformers, attention, analysis, early exit, machine learning, survey, efficiency, green AI, dialects, adaptive inference, low-resource, cross-lingual generalization, language models
SHORT BIO
PhD candidate at The Hebrew University of Jerusalem, Israel, under the supervision of Dr. Roy Schwartz.
Presentations

Transformers are Multi-State RNNs
Matanel Oren and 4 other authors

Finding the SWEET Spot: Analysis and Improvement of Adaptive Inference in Low Resource Settings
Daniel Rotem and 3 other authors

Efficient Methods for Natural Language Processing: A Survey
Marcos Treviso and 15 other authors

How Much Does Attention Actually Attend? Questioning the Importance of Attention in Pretrained Transformers
Michael Hassid