VIDEO DOI: https://doi.org/10.48448/gwf5-qm41

poster

ACL 2024

August 22, 2024

Bangkok, Thailand

Are Decoder-Only Language Models Better than Encoder-Only Language Models in Understanding Word Meaning?

keywords:

word-in-context

word sense disambiguation

large language models

The natural language processing field has evolved around language models for the past few years: from n-gram language models used for re-ranking, to transfer learning with encoder-only (BERT-like) language models, and finally to large language models (LLMs) as general-purpose solvers. LLMs are dominated by the decoder-only architecture and are popular for their efficacy across numerous tasks. Because LLMs are regarded as having strong comprehension abilities and strong capabilities on new, unseen tasks, one may quickly assume that decoder-only LLMs always outperform encoder-only models, especially at understanding word meaning. In this paper, we demonstrate that decoder-only LLMs perform worse on word meaning comprehension than an encoder-only language model with vastly fewer parameters.
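The word-in-context (WiC) task named in the keywords is the standard probe for the kind of word meaning comprehension the abstract refers to: given two sentences sharing a target word, decide whether the word carries the same sense in both. The toy sketch below illustrates the task format with a trivial majority-class baseline; the example sentences and the baseline are hypothetical and are not drawn from the paper's data or method.

```python
# Illustrative sketch of the Word-in-Context (WiC) task format.
# Each instance pairs two sentences containing the same target word;
# the label says whether the word has the same sense in both contexts.
wic_examples = [
    {"word": "bank",
     "sentence1": "She deposited the check at the bank.",
     "sentence2": "They had a picnic on the river bank.",
     "same_sense": False},
    {"word": "run",
     "sentence1": "I run five miles every morning.",
     "sentence2": "She likes to run before breakfast.",
     "same_sense": True},
]

def majority_baseline(examples):
    """Always predict one label -- the floor any real model must beat."""
    preds = [True] * len(examples)  # hypothetical majority label
    correct = sum(p == ex["same_sense"] for p, ex in zip(preds, examples))
    return correct / len(examples)

accuracy = majority_baseline(wic_examples)
```

On a balanced toy set like this, the baseline scores 0.5; the paper's comparison asks whether decoder-only LLMs or a much smaller encoder-only model climbs higher above that floor.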


