
Injy Sarhan
Doctoral student @ Utrecht University
data augmentation
pre-trained language models
multi-stage fine-tuning
data-enriched fine-tuning
multi-task fine-tuning
SHORT BIO
Injy Sarhan is a 4th-year Ph.D. student at Utrecht University, The Netherlands. Her work focuses primarily on Natural Language Processing (NLP), mainly Information Extraction and Knowledge Graph construction. She is also currently exploring other machine-learning approaches to automatic ontology construction using reinforcement learning. Injy's latest paper, "Open-CyKG: An Open Cyber Threat Intelligence Knowledge Graph," published in the Elsevier journal Knowledge-Based Systems, addresses Knowledge Graph construction for cybersecurity reports using an attention-based Open Information Extraction (OIE) model.

During her Ph.D., Injy has published several other works on OIE, including "Uncovering Algorithmic Approaches in Open Information Extraction" (Benelux 2018), "Contextualized Word Embeddings in a Neural Open Information Extraction Model" (NLDB 2019), and "Can We Survive without Labelled Data in NLP? Transfer Learning for Open Information Extraction" (Appl. Sci. 2020), in which she evaluated the transferability of neural Open Information Extraction systems to different domains and tasks, such as Relation Extraction, to reduce the problem of insufficient training data for neural network models and to encourage model generalization.
Presentations

UU-Tax at SemEval-2022 Task 3: Improving the generalizability of language models for taxonomy classification through data augmentation
Injy Sarhan