

workshop paper
Pirates at ArabicNLU2024: Enhancing Arabic Word Sense Disambiguation using Transformer-Based Approaches
keywords:
AraBERTv2
Arabic word sense disambiguation
sentence transformers
few-shot learning
contrastive learning
This paper presents a novel approach to Arabic Word Sense Disambiguation (WSD) that leverages transformer-based models to tackle the complexities of the Arabic language. Using the SALMA dataset, we applied several techniques, including Sentence Transformers with Siamese networks and the SetFit framework optimized for few-shot learning. Our experiments, structured around a robust evaluation framework, achieved a promising F1-score of up to 71%, securing second place in ArabicNLU 2024: The First Arabic Natural Language Understanding Shared Task. These results demonstrate the efficacy of our approach, especially in handling the challenges posed by homophones, homographs, and the lack of diacritics in Arabic text. The proposed methods significantly outperformed traditional WSD techniques, highlighting their potential to improve the accuracy of Arabic natural language processing applications.
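To illustrate the Siamese-network idea the abstract mentions: the model embeds a word's context and each candidate sense gloss with the same encoder into a shared vector space, then selects the sense whose gloss embedding is closest to the context embedding. The sketch below mimics that pipeline with a toy bag-of-words embedder standing in for a trained sentence transformer; the `embed` function, the English example, and the glosses are illustrative assumptions, not the authors' implementation or the SALMA data.

```python
import math
from collections import Counter

def embed(text):
    # Toy stand-in for a sentence-transformer encoder:
    # a bag-of-words count vector (illustrative only).
    return Counter(text.lower().split())

def cosine(a, b):
    # Cosine similarity between two sparse count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def disambiguate(context, sense_glosses):
    # Siamese-style WSD: encode the context and every candidate
    # gloss with the SAME encoder, then pick the closest gloss.
    ctx = embed(context)
    return max(sense_glosses, key=lambda s: cosine(ctx, embed(sense_glosses[s])))

# Hypothetical English example standing in for an Arabic one.
senses = {
    "bank_river": "sloping land beside a body of water",
    "bank_money": "an institution that accepts deposits and lends money",
}
print(disambiguate("she sat on the land beside the water", senses))  # → bank_river
```

In the paper's actual setup, a fine-tuned sentence transformer (e.g. built on AraBERTv2) would replace the toy encoder, and SetFit's contrastive fine-tuning would adapt it from only a few labeled examples per sense.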