Underline DOI: https://doi.org/10.48448/bazd-z165
Findings / Work in Progress
Sparsifying Transformer Models with Trainable Representation Pooling