
Jue WANG

inference acceleration · classification · continual learning · knowledge distillation · pre-trained language model · large language model

4 presentations · 4 views

SHORT BIO

Hello, I am a PhD student in the Data Intelligence Lab at Zhejiang University, advised by Prof. Lidan Shou. My current research interests lie in Distributed Systems and Efficient Algorithms for NLP (both training and inference). I am also interested in NLP more broadly, e.g., Information Extraction and NLP in low-resource scenarios.

Presentations

Draft & Verify: Lossless Large Language Model Acceleration via Self-Speculative Decoding

Jun Zhang and 6 other authors

Effective Continual Learning for Text Classification with Lightweight Snapshots

Jue WANG and 4 other authors

SkipBERT: Efficient Inference with Shallow Layer Skipping

Jue WANG and 4 other authors

Effective Slot Filling via Weakly-Supervised Dual-Model Learning

Jue WANG and 4 other authors
