VIDEO DOI: https://doi.org/10.48448/kr74-8y10

workshop paper

ACL 2024

August 15, 2024

Bangkok, Thailand

Adapting transformer models to morphological tagging of two highly inflectional languages: a case study on Ancient Greek and Latin

keywords: inflectional languages, morphological tagging, Ancient Greek, Latin, transformers

Natural language processing for Ancient Greek and Latin, highly inflectional languages with small corpora, requires special techniques. For morphological tagging, transformer models show promising potential, but how best to use them is unclear. For both languages, this paper examines the impact of using morphological lexica, of training different model types (a single model predicting a combined feature tag, multiple models for separate features, and a multi-task model for all features), and of adding linguistic constraints. We find that, although simply fine-tuning a transformer to predict a monolithic tag can already yield decent results, each of these adaptations further improves tagging accuracy.
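The model types compared in the abstract differ mainly in what the classifier predicts: one combined (monolithic) tag per token versus one prediction per morphological feature, optionally filtered by linguistic constraints. As a minimal sketch of that distinction (not the authors' code; the feature inventory, tag format, and constraint below are illustrative assumptions):

```python
# Illustrative sketch of the tag schemes compared in the paper.
# The feature set, the "f=v|f=v" tag format, and the toy constraint
# are assumptions for illustration, not the paper's actual scheme.

FEATURES = ["pos", "number", "gender", "case"]

def to_monolithic(analysis):
    """Single-model setup: collapse per-feature values into one combined tag."""
    return "|".join(f"{f}={analysis.get(f, '-')}" for f in FEATURES)

def from_monolithic(tag):
    """Multi-model view: recover per-feature values from a combined tag."""
    return dict(pair.split("=") for pair in tag.split("|"))

def violates_constraint(analysis):
    """Toy linguistic constraint: only nominal categories carry case."""
    nominals = {"NOUN", "ADJ", "PRON"}
    return analysis.get("case", "-") != "-" and analysis.get("pos") not in nominals

analysis = {"pos": "NOUN", "number": "sg", "gender": "m", "case": "nom"}
tag = to_monolithic(analysis)  # one label for a single combined-tag classifier
assert from_monolithic(tag) == analysis
assert not violates_constraint(analysis)
```

A monolithic tagger treats each distinct combined string as one class, so its label space grows multiplicatively with the features; per-feature or multi-task models keep the label spaces small but can emit inconsistent combinations, which is where constraints of the kind sketched above become useful.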

