
Joshua Ainslie
Software Engineer @ Google
Topics: transformers, compositional generalization, efficient NLP, residual attention, NLP, entity extraction, multimodal data, architecture, CV, structure annotation, post/pre layer norm, VQ-VAE, retrieval augmented, conditional computation, LLM
10 presentations · 25 views
Short Bio
Joshua Ainslie is a software engineer at Google Research who works on improved efficiency and quality for Transformer models.
Presentations

MEMORY-VQ: Compression for Tractable Internet-Scale Memory
Yury Zemlyanskiy and 6 other authors

CoLT5: Faster Long-Range Transformers with Conditional Computation
Joshua Ainslie and 11 other authors

GQA: Training Generalized Multi-Query Transformer Models from Multi-Head Checkpoints
Joshua Ainslie and 5 other authors

A Suite of Generative Tasks for Multi-Level Multimodal Webpage Understanding
Andrea Burns and 7 other authors

Making Transformers Solve Compositional Tasks
Santiago Ontanon and 3 other authors

FormNet: Structural Encoding beyond Sequential Modeling in Form Document Information Extraction
Chen-Yu Lee and 9 other authors

RealFormer: Transformer Likes Residual Attention
Ruining He and 3 other authors

ReadTwice: Reading Very Large Documents with Memories
Yury Zemlyanskiy and 5 other authors

Sparse Mixers: Combining MoE and Mixing to build a more efficient BERT
James Lee-Thorp and 1 other author