
Fei Huang

RESEARCH TOPICS

  • fine-tuning
  • pre-training
  • benchmarking
  • language modeling
  • prompting
  • pre-trained models
  • non-autoregressive text generation
  • efficient text generation
  • datasets for low-resource languages


SHORT BIO

Fei Huang is a PhD candidate in the Department of Computer Science at Tsinghua University. His research focuses on natural language generation, particularly non-autoregressive text generation models. He has authored 10 papers published in top-tier conferences and journals, including ICML, ACL, EMNLP, and TACL.

PRESENTATIONS

Predicting Rewards Alongside Tokens: Non-disruptive Parameter Insertion for Efficient Inference Intervention in Large Language Model

Chenhan Yuan and 6 other authors

Directed Acyclic Transformer Pre-training for High-quality Non-autoregressive Text Generation

Fei Huang and 2 other authors
