This paper addresses few-shot relation learning, which performs relational inference on knowledge graphs (KGs) using only a small number of training examples. While existing methods have focused on leveraging relation-specific information, the rich semantics inherent in KGs have been largely overlooked. To address this gap, we propose a novel prompt meta-learning (PromptMeta) framework that seamlessly integrates meta-semantics with relational information. PromptMeta features two key innovations: (1) a pool of meta-semantic prompts (MSPs) that learn and consolidate high-level meta-semantics, enabling effective knowledge transfer and adaptation to rare and emerging relations; and (2) learnable fusion tokens that dynamically combine the meta-semantics with task-specific relational information tailored to each few-shot task. Both components are optimized jointly with model parameters within the meta-learning framework. Extensive experiments and analyses on two real-world KG datasets demonstrate the effectiveness of PromptMeta in adapting to novel relations with limited data.
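To make the two components concrete, the sketch below shows one plausible reading of the abstract in PyTorch: a learnable pool of meta-semantic prompts that a task's relation embedding attends over, and a learnable fusion token that gates the retrieved meta-semantics against the task-specific relation embedding. The abstract does not specify the actual architecture, so every module, parameter name, and dimension here (`MetaSemanticPromptPool`, `FusionToken`, `dim=128`, etc.) is a hypothetical illustration, not the authors' design.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class MetaSemanticPromptPool(nn.Module):
    """Illustrative pool of meta-semantic prompts (MSPs).

    Holds learnable prompt vectors shared across few-shot tasks; a task's
    relation embedding soft-attends over the pool to retrieve high-level
    meta-semantics. This is an assumption-laden sketch, not the paper's code.
    """

    def __init__(self, num_prompts: int = 16, dim: int = 128):
        super().__init__()
        self.prompts = nn.Parameter(torch.randn(num_prompts, dim) * 0.02)

    def forward(self, rel_emb: torch.Tensor) -> torch.Tensor:
        # Scaled dot-product attention of relation embeddings over the pool.
        scores = rel_emb @ self.prompts.t() / rel_emb.size(-1) ** 0.5  # (B, P)
        weights = F.softmax(scores, dim=-1)
        return weights @ self.prompts  # (B, dim): aggregated meta-semantics


class FusionToken(nn.Module):
    """Illustrative learnable fusion token.

    Gates the retrieved meta-semantics against the task-specific relation
    embedding; one simple way to realize the "dynamic combination" the
    abstract describes.
    """

    def __init__(self, dim: int = 128):
        super().__init__()
        self.token = nn.Parameter(torch.randn(dim) * 0.02)
        self.gate = nn.Linear(3 * dim, dim)

    def forward(self, meta_sem: torch.Tensor, rel_emb: torch.Tensor) -> torch.Tensor:
        token = self.token.expand_as(rel_emb)
        # Gate computed from the fusion token and both information sources.
        g = torch.sigmoid(self.gate(torch.cat([token, meta_sem, rel_emb], dim=-1)))
        return g * meta_sem + (1.0 - g) * rel_emb  # fused task representation


if __name__ == "__main__":
    pool, fusion = MetaSemanticPromptPool(), FusionToken()
    rel_emb = torch.randn(4, 128)           # embeddings for 4 few-shot tasks
    fused = fusion(pool(rel_emb), rel_emb)  # (4, 128)
    print(fused.shape)
```

In a meta-learning loop, both the prompt pool and the fusion token would be treated as ordinary model parameters and updated jointly across episodes, consistent with the abstract's statement that the two components are optimized together with the rest of the model.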