
Jue WANG
inference acceleration
classification
continual learning
knowledge distillation
pretrained language model
large language model
4 presentations · 4 views
SHORT BIO
Hello, I am a PhD student in the Data Intelligence Lab at Zhejiang University, advised by Prof. Lidan Shou. My current research interests lie in distributed systems and efficient algorithms for NLP (covering both training and inference). I am also interested in NLP in a broader sense, e.g., information extraction and NLP in low-resource scenarios.
Presentations

Draft & Verify: Lossless Large Language Model Acceleration via Self-Speculative Decoding
Jun Zhang and 6 other authors

Effective Continual Learning for Text Classification with Lightweight Snapshots
Jue WANG and 4 other authors

SkipBERT: Efficient Inference with Shallow Layer Skipping
Jue WANG and 4 other authors

Effective Slot Filling via Weakly-Supervised Dual-Model Learning
Jue WANG and 4 other authors