

workshop paper
DRU at WojoodNER 2024: A Multi-level Method Approach
keywords:
binary cross-entropy with logits loss
fine-grained entity recognition
multi-label token classification
gemma
wojood shared task 2024
wojoodfine
wojood
arabic ner
bloom
arabic nlp
token classification
arabert
natural language processing
named entity recognition
nlp
ner
bert
In this paper, we present our submission to the WojoodNER 2024 Shared Task, addressing the flat and nested sub-tasks (1 and 2). We experiment with three approaches: (i) fine-tuning an Arabic-adapted BLOOMZ-7b-mt, GEMMA-7b, and AraBERTv2 on a multi-label token classification task; (ii) training two AraBERTv2 models, one for main types and one for sub-types; and (iii) training five models, one for main types and four for the four sub-types. On the WojoodNER 2024 test set, the three fine-tuned models in the first approach performed similarly, with AraBERTv2 favored (F1: Flat=.8780, Nested=.9040). The five-model approach performed slightly better (F1: Flat=.8782, Nested=.9043).
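To illustrate the multi-label token classification objective named in the keywords, here is a minimal, stdlib-only sketch of binary cross-entropy with logits over per-token label vectors. This is not the authors' implementation (which fine-tunes AraBERTv2, BLOOMZ-7b-mt, and GEMMA-7b); the toy logits and targets below are hypothetical, and the numerically stable formulation is the standard one (e.g. as in PyTorch's `BCEWithLogitsLoss`).

```python
import math

def bce_with_logits(logits, targets):
    """Mean binary cross-entropy with logits over all token/label pairs.

    Numerically stable form: max(x, 0) - x*y + log(1 + exp(-|x|)),
    which avoids overflow in exp() for large-magnitude logits.
    """
    total, count = 0.0, 0
    for token_logits, token_targets in zip(logits, targets):
        for x, y in zip(token_logits, token_targets):
            total += max(x, 0.0) - x * y + math.log1p(math.exp(-abs(x)))
            count += 1
    return total / count

# Toy example: 2 tokens, 3 entity labels. In the multi-label setting a
# token can activate several labels at once (e.g. a main type and a
# sub-type simultaneously, as in the fine-grained WojoodFine tagset).
logits  = [[2.0, -1.0, 0.5], [-2.0, 3.0, -0.5]]
targets = [[1.0,  0.0, 1.0], [ 0.0, 1.0,  0.0]]
print(round(bce_with_logits(logits, targets), 4))  # mean loss over 6 pairs
```

Because each label gets an independent sigmoid rather than a shared softmax, the model can assign overlapping entity types to the same token, which is what distinguishes this setup from standard single-label NER tagging.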