Few-shot knowledge graph (KG) relational learning aims to perform reasoning over relations given only a handful of training examples. While most existing methods for few-shot relational learning focus on leveraging relational information, the rich semantics inherent in KGs have been largely overlooked. To address this gap, we propose a novel prompted meta-learning (PromptMeta) framework that seamlessly integrates meta-semantics with relational information for few-shot relational learning. PromptMeta has two key innovations: (1) a Meta-Semantic Prompt (MSP) pool that consolidates high-level meta-semantics, enabling effective knowledge transfer and adaptation to rare and newly emerging relations; and (2) a learnable fusion token that dynamically combines meta-semantics with task-specific relational information tailored to different few-shot tasks. Both components are optimized jointly with model parameters within a meta-learning framework. Extensive experiments and analyses on two real-world KG datasets demonstrate the effectiveness of our approach. The code and datasets are available at: https://github.com/A20250216/ARR02.
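To make the two components concrete, the following is a minimal sketch of how a meta-semantic prompt pool and a learnable fusion token could interact at inference time. All names, dimensions, the top-k retrieval rule, and the attention-style fusion are illustrative assumptions, not the paper's actual implementation (the authoritative code is in the linked repository):

```python
# Hedged sketch (not the authors' code): a prompt pool queried by a task
# embedding, plus a fusion token that mixes retrieved meta-semantics with
# task-specific relational information. NumPy stands in for a real DL framework.
import numpy as np

rng = np.random.default_rng(0)
DIM, POOL_SIZE, TOP_K = 64, 16, 4          # all sizes are assumptions

msp_pool = rng.normal(size=(POOL_SIZE, DIM))   # Meta-Semantic Prompt pool (learnable)
fusion_token = rng.normal(size=DIM)            # learnable fusion token (assumed vector)

def retrieve_meta_semantics(task_emb):
    """Select the TOP_K pool prompts most similar (cosine) to the task embedding."""
    sims = msp_pool @ task_emb / (
        np.linalg.norm(msp_pool, axis=1) * np.linalg.norm(task_emb) + 1e-8)
    idx = np.argsort(sims)[-TOP_K:]
    return msp_pool[idx]                       # shape: (TOP_K, DIM)

def fuse(task_emb, prompts):
    """Attention-style fusion: the fusion token attends over the retrieved
    meta-semantic prompts and the task-specific relation embedding."""
    keys = np.vstack([prompts, task_emb])      # (TOP_K + 1, DIM)
    logits = keys @ fusion_token / np.sqrt(DIM)
    weights = np.exp(logits - logits.max())
    weights /= weights.sum()                   # softmax over prompts + task
    return weights @ keys                      # fused relation representation, (DIM,)

task_emb = rng.normal(size=DIM)                # few-shot task's relation embedding
fused = fuse(task_emb, retrieve_meta_semantics(task_emb))
print(fused.shape)
```

In a meta-learning loop, both `msp_pool` and `fusion_token` would be updated jointly with the model parameters across sampled few-shot tasks, which is what lets the pooled meta-semantics transfer to rare or newly emerging relations.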