
Aditya Shah

Graduate student @ Virginia Tech

Research interests: transfer learning, language models, prompt tuning


SHORT BIO

I completed my research-based Master's degree in the Department of Computer Science at Virginia Tech. My research focuses on applying Large Language Models (LLMs) to different downstream applications. My Master's thesis was titled "Leveraging Transformer Models and Elasticsearch to Help Prevent and Manage Diabetes through EFT Cues." Previously, I had the pleasure of working at Google Brain as a research intern. I will be joining Capital One headquarters in McLean, Virginia, as a senior data scientist.

Presentations

ADEPT: Adapter-based Efficient Prompt Tuning Approach for Language Models
Aditya Shah, Surendrabikram Thapa, Aneesh Jain, and Lifu Huang (surendrabikram@vt.edu)
Status: Accepted
