VIDEO DOI: https://doi.org/10.48448/tpk8-kp26

workshop paper

ACL 2024

August 16, 2024

Bangkok, Thailand

Team_Zero at StanceEval2024: Frozen PLMs for Arabic Stance Detection

keywords: pretrained language models, natural language processing, stance detection, BERT

This research explores the effectiveness of pre-trained language models (PLMs) used as frozen feature extractors for Arabic stance detection on social media, focusing on topics such as women empowerment, COVID-19 vaccination, and digital transformation. By leveraging sentence transformers to extract embeddings and adding aggregation architectures on top of BERT, we aim to achieve high performance without the computational expense of fine-tuning. Our approach yields significant savings in resources and time while maintaining competitive performance, achieving an F1-score of 78.62 on the test set. This study highlights the potential of frozen PLMs for stance detection in Arabic social media analysis, offering a resource-efficient alternative to traditional fine-tuning.
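The pipeline described above can be illustrated with a minimal sketch: a frozen sentence-transformer encodes the Arabic posts, and a lightweight classifier is trained on the resulting embeddings, so the PLM itself never receives gradient updates. The model name, labels, and classifier head below are illustrative assumptions, not the authors' exact configuration.

```python
# Minimal sketch of frozen-PLM feature extraction for stance detection.
# Model name, labels, and classifier head are assumptions for illustration.
from sentence_transformers import SentenceTransformer
from sklearn.linear_model import LogisticRegression

# Hypothetical multilingual sentence-transformer; the paper's actual encoder may differ.
encoder = SentenceTransformer("paraphrase-multilingual-mpnet-base-v2")

train_texts = ["..."]      # Arabic posts on women empowerment, COVID-19 vaccination, etc.
train_labels = ["favor"]   # stance labels, e.g. favor / against / none

# Encode with the frozen encoder: no gradient updates, so no fine-tuning cost.
train_embeddings = encoder.encode(train_texts)

# Lightweight classifier trained on top of the frozen embeddings.
clf = LogisticRegression(max_iter=1000)
clf.fit(train_embeddings, train_labels)

# Predict stance for unseen posts.
test_embeddings = encoder.encode(["..."])
predictions = clf.predict(test_embeddings)
```

Because only the small classifier is trained, this setup avoids storing optimizer states and gradients for the full transformer, which is the source of the resource and time savings the abstract refers to.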

