
Rabeeh Karimi Mahabadi
EPFL
multi-task learning
generation
few-shot learning
pretrained language models
efficient fine-tuning
diffusion
adapters
hypernetworks
parameter-efficient fine-tuning
language modelling
3 presentations
12 views
SHORT BIO
I am a PhD student in NLP at EPFL/Idiap Research Institute. I obtained my master's degree at ETH Zurich in machine learning and computer vision. I am interested in transfer learning and in training large-scale language models efficiently, with less memory and training time.
Presentations

TESS: Text-to-Text Self-Conditioned Simplex Diffusion
Rabeeh Karimi Mahabadi and 6 other authors

Prompt-free and Efficient Few-shot Learning with Language Models
Rabeeh Karimi Mahabadi and 6 other authors

Parameter-efficient Multi-task Fine-tuning for Transformers via Shared Hypernetworks
Rabeeh Karimi Mahabadi and 3 other authors