Ivan Vulić

University of Cambridge

Topics: cross-lingual transfer, multilingual, multilingual NLP, multilinguality, cross-lingual NLP, bilingual lexicon induction, lexical semantics, adapters, large language models, few-shot learning, transfer learning, pretrained language models, knowledge injection, contrastive learning, parameter-efficient fine-tuning

52 presentations · 60 views

SHORT BIO

Ivan Vulić is a Principal Research Associate (equivalent to Associate Professor) and a Royal Society University Research Fellow in the Language Technology Lab, University of Cambridge. He is also a Senior Scientist at PolyAI and a member of the Steering Committee of the newly established Centre for Human-Inspired Artificial Intelligence (CHIA) at Cambridge. Ivan holds a PhD in Computer Science from KU Leuven, awarded summa cum laude. In 2021 he received the annual Karen Spärck Jones Award from the British Computer Society for his research contributions to NLP and Information Retrieval. His core expertise spans representation learning, cross-lingual learning, conversational AI, and human language understanding; distributional, lexical, multi-modal, and knowledge-enhanced semantics in monolingual and multilingual contexts; transfer learning for cross-lingual NLP applications such as conversational AI in low-resource languages; and machine learning for cross-lingual and multilingual NLP. He has published numerous papers at top-tier NLP and Information Retrieval conferences and journals, and his research has received several best paper awards. He serves as an area chair and regularly reviews for all major NLP and machine learning conferences and journals. Ivan has given numerous invited talks in academia and industry and has co-organised a number of NLP conferences and workshops.

Presentations

On Bilingual Lexicon Induction with Large Language Models

Yaoyiran Li and 2 other authors

Quantifying the Dialect Gap and its Correlates Across Languages

Anjali Kantharuban and 2 other authors

Survival of the Most Influential Prompts: Efficient Black-Box Prompt Search via Clustering and Pruning

Han Zhou and 3 other authors

CompoundPiece: Evaluating and Improving Decompounding Performance of Language Models

Benjamin Minixhofer and 2 other authors

A Systematic Study of Performance Disparities in Multilingual Task-Oriented Dialogue Systems

Songbo Hu and 7 other authors

Unifying Cross-Lingual Transfer across Scenarios of Resource Scarcity

Alan Ansell and 4 other authors

Multi3NLU++: A Multilingual, Multi-Intent, Multi-Domain Dataset for Natural Language Understanding in Task-Oriented Dialogue

Nikita Moghe and 5 other authors

Free Lunch: Robust Cross-Lingual Transfer via Model Checkpoint Averaging

Fabian David Schmidt and 2 other authors

Where's the Point? Self-Supervised Multilingual Punctuation-Agnostic Sentence Segmentation

Benjamin Minixhofer and 2 other authors

Translation-Enhanced Multilingual Text-to-Image Generation

Yaoyiran Li and 4 other authors

Can Pretrained Language Models (Yet) Reason Deductively?

Songbo Hu and 4 other authors

Probing Cross-Lingual Lexical Knowledge from Multilingual Sentence Encoders

Ivan Vulić and 5 other authors

SLICER: Sliced Fine-Tuning for Low-Resource Cross-Lingual Transfer for Named Entity Recognition

Fabian David Schmidt and 2 other authors

Don't Stop Fine-Tuning: On Training Regimes for Few-Shot Cross-Lingual Transfer with Multilingual Language Models

Fabian David Schmidt and 2 other authors

Modular and Parameter-Efficient Fine-Tuning for NLP Models

Sebastian Ruder and 2 other authors
