VIDEO DOI: https://doi.org/10.48448/0s8c-6470

workshop paper

ACL 2024

August 15, 2024

Bangkok, Thailand

Dolomites@#SMM4H 2024: Helping LLMs "Know The Drill" in Low-Resource Settings - A Study on Social Media Posts

keywords:

MTL-DA (multi-task learning data augmentation)

entity recognition

large language model

classification

data augmentation

information extraction

The amount of data used to fine-tune LLMs plays a crucial role in the performance of these models on downstream tasks; consequently, deploying them in low-resource settings is not straightforward. In this work, we investigate two new multi-task learning data augmentation approaches for fine-tuning LLMs when little data is available: "In-domain Augmentation" of the training data and extracting "Drills" as smaller tasks from the target dataset. We evaluate the proposed approaches in three natural language processing settings drawn from the SMM4H 2024 competition tasks: multi-class classification, entity recognition, and information extraction. The results show that both techniques improve model performance in all three settings, suggesting that knowledge learned during multi-task training transfers positively to the target task.
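The abstract describes the two strategies only at a high level. As a rough, hypothetical sketch of how they could be realized, the Python below illustrates one plausible reading: the function names, the example schema, and the "contains_entity" drill are assumptions for illustration, not the authors' implementation.

```python
# Hypothetical sketch; function names, the example schema, and the
# "contains_entity" drill are assumptions, not the authors' implementation.
import random

def in_domain_augmentation(examples, paraphrase):
    """Grow the training set with in-domain variants of existing posts.
    `paraphrase` is any callable (e.g. an LLM-backed rewriter) that rewrites
    a post while preserving its label."""
    augmented = []
    for ex in examples:
        augmented.append(ex)
        augmented.append({"text": paraphrase(ex["text"]), "label": ex["label"]})
    return augmented

def make_drill_examples(ner_examples):
    """Extract a simpler 'drill' task from an entity-recognition dataset:
    here, a binary 'does this post mention any entity?' classification."""
    return [
        {"task": "contains_entity",
         "text": ex["text"],
         "label": "yes" if ex["entities"] else "no"}
        for ex in ner_examples
    ]

def build_multitask_mixture(target_examples, drill_examples, seed=0):
    """Interleave target-task and drill examples for multi-task fine-tuning."""
    mixture = [dict(ex, task=ex.get("task", "target")) for ex in target_examples]
    mixture += drill_examples
    random.Random(seed).shuffle(mixture)
    return mixture
```

The point of shuffling the drills into the same fine-tuning run, rather than training on them separately, is that it makes this a multi-task setup: the model sees the easier auxiliary objective interleaved with the target task.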

