VIDEO DOI: https://doi.org/10.48448/v6t2-f665

workshop paper

ACL 2024

August 16, 2024

Bangkok, Thailand

ASOS at KSAA-CAD 2024: One Embedding is All You Need for Your Dictionary

keywords: llm, arabic, nlp, deep learning, bert

Semantic search tasks have advanced rapidly with the progress of large language models, including Reverse Dictionary and Word Sense Disambiguation tasks in Arabic. This paper describes our participation in the Contemporary Arabic Dictionary Shared Task. We propose two models that achieved first place in both tasks. We conducted comprehensive experiments on the five latest multilingual sentence transformers and the Arabic BERT model for semantic embedding extraction. We achieved a ranking score of 0.06 on the Reverse Dictionary task, double that of last year's winner, and an accuracy of 0.268 on the Word Sense Disambiguation task.
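The reverse-dictionary setup the abstract describes can be illustrated as nearest-neighbor retrieval: embed a definition (gloss) and return the vocabulary word whose embedding is most similar. The sketch below uses toy hand-made vectors and plain cosine similarity; the vectors, vocabulary, and function names are illustrative stand-ins, not the authors' models or data.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def reverse_dictionary(gloss_vec, word_vecs):
    """Return the vocabulary word whose embedding is closest to the gloss.

    In the shared task, gloss_vec would come from a sentence
    transformer or Arabic BERT encoding of the definition; here the
    vectors are toy examples.
    """
    return max(word_vecs, key=lambda w: cosine(gloss_vec, word_vecs[w]))

# Toy 3-dimensional "embeddings" for a tiny vocabulary.
word_vecs = {
    "book":  [0.9, 0.1, 0.0],
    "river": [0.0, 0.8, 0.6],
    "star":  [0.1, 0.2, 0.9],
}
gloss = [0.85, 0.15, 0.05]  # stand-in embedding of a definition of "book"
print(reverse_dictionary(gloss, word_vecs))  # prints "book"
```

In practice both shared-task subtasks reduce to ranking candidates by embedding similarity, which is why the quality of the sentence embeddings dominates the results.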


