
Michael Hassid

Graduate student @ Hebrew University of Jerusalem

Topics: transformers, attention, analysis, early exit, machine learning, survey, efficiency, green AI, dialects, adaptive inference, low-resource, cross-lingual generalization, language models

5 presentations · 1 view

SHORT BIO

PhD candidate at The Hebrew University of Jerusalem, Israel, under the supervision of Dr. Roy Schwartz.

Presentations

Transformers are Multi-State RNNs

Matanel Oren and 4 other authors

Finding the SWEET Spot: Analysis and Improvement of Adaptive Inference in Low Resource Settings

Daniel Rotem and 3 other authors

Efficient Methods for Natural Language Processing: A Survey

Marcos Treviso and 15 other authors

How Much Does Attention Actually Attend? Questioning the Importance of Attention in Pretrained Transformers

Michael Hassid

