AAAI 2026 Main Conference

January 22, 2026

Singapore, Singapore


With the increasing number of open-source models available to the average consumer, Low-Rank Adaptation (LoRA) has become an essential tool for adapting large language models with limited computational resources (Hu et al., 2021). LoRA introduces trainable low-rank matrices A and B to update pre-trained weights efficiently, significantly reducing memory and compute requirements compared to full fine-tuning. The original motivation was that the weight update during fine-tuning could be approximated by ∆W ≈ AB, drastically cutting the number of trainable parameters while still capturing task-specific adaptations, and thus greatly reducing the number of parameters that need to be stored in memory. While recent work has shown that LoRA is not strictly equivalent to full fine-tuning (Shuttleworth et al., 2024), it remains an efficient and practical method for adapting models to specific tasks, playing a crucial role in the democratization of AI. Despite its efficiency, LoRA has been shown to be suboptimal for fine-tuning models with large embedding dimensions, owing to differences in the magnitudes of the values of A and B (Hayou et al., 2024; Yen et al., 2024; Zhang & Pilanci, 2024). While existing LoRA variants claim to address these problems (Hayou et al., 2024; Yen et al., 2024; Bensaïd et al., 2025; Zhang & Pilanci, 2024), we rigorously evaluate the performance of these variants on a variety of tasks and models, and investigate a novel alternative approach that introduces a penalty term.
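The low-rank update ∆W ≈ AB described above can be sketched in a few lines. This is a minimal NumPy illustration under assumed dimensions (d = 512, rank r = 8), not the authors' implementation; following one common convention, one factor is zero-initialized so the adapted model starts identical to the pre-trained one.

```python
import numpy as np

# Hypothetical shapes for illustration: instead of updating a full
# d_out x d_in weight matrix W, LoRA trains two low-rank factors
# A (d_out x r) and B (r x d_in) so the adapted weight is W + A @ B.
d_out, d_in, r = 512, 512, 8

rng = np.random.default_rng(0)
W = rng.standard_normal((d_out, d_in))       # frozen pre-trained weight
A = np.zeros((d_out, r))                     # zero init: delta W = 0 at start
B = rng.standard_normal((r, d_in)) * 0.01    # small random init

def lora_forward(x, W, A, B):
    """Forward pass with the low-rank update: y = (W + A B) x.

    Computed as W @ x + A @ (B @ x) to avoid ever materializing
    the full d_out x d_in update matrix.
    """
    return W @ x + A @ (B @ x)

# Trainable-parameter savings relative to full fine-tuning:
full_params = d_out * d_in          # 512 * 512 = 262144
lora_params = r * (d_out + d_in)    # 8 * 1024  = 8192
print(full_params, lora_params)
```

With these assumed shapes, the LoRA factors hold 8,192 trainable parameters versus 262,144 for a full update of the same layer, a 32x reduction that grows with the embedding dimension.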

