Zewen Chi

Topics: pre-training, cross-lingual, language model, large language models, multimodality, cross-lingual language model, word alignment, phrase, modular deep learning, multilingual transformer, cross-lingual transfer, protein understanding, retrieval


SHORT BIO

I am a PhD student at Beijing Institute of Technology, advised by Heyan Huang. I have interned at Microsoft Research Asia, where I was advised by Li Dong and Furu Wei, and at ByteDance AI Lab.

PRESENTATIONS

  • ProtLLM: An Interleaved Protein-Language LLM with Protein-as-Word Pre-Training (Le Zhuo and 8 other authors)
  • Can Cross-Lingual Transferability of Multilingual Transformers Be Activated Without End-Task Data? (Zewen Chi)
  • XLM-E: Cross-lingual Language Model Pre-training via ELECTRA (Zewen Chi and 10 other authors)
  • mT6: Multilingual Pretrained Text-to-Text Transformer with Translation Pairs (Zewen Chi)
  • Improving Pretrained Cross-Lingual Language Models via Self-Labeled Word Alignment (Zewen Chi)
  • InfoXLM: An Information-Theoretic Framework for Cross-Lingual Language Model Pre-Training (Zewen Chi)
