This paper addresses Graph Domain Adaptation (GDA) in the presence of noisy source graph labels. Unlike existing GDA methods, which assume clean source labels, we propose a novel framework, Nested Graph Pseudo-Label Refinement (NeGPR). NeGPR first pretrains dual branches, a semantic branch and a topological branch, that enhance neighborhood consistency in the feature space to mitigate the effects of label noise. It then applies a nested refinement mechanism in which each branch selects high-confidence target samples to guide the adaptation of the other, and integrates a noise-aware regularization strategy to alleviate overfitting and the negative influence of pseudo-label noise originating from the source domain. Experimental results show that NeGPR outperforms state-of-the-art methods under severe label noise, achieving accuracy gains of up to 12.7%.
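
To make the cross-branch idea concrete, the sketch below shows one way high-confidence pseudo-label selection between two branches could look; the function name `select_cross_branch`, the confidence threshold `tau`, and the agreement check are illustrative assumptions under a generic dual-branch setup, not the exact NeGPR procedure.

```python
# Illustrative sketch (assumed setup, not the paper's exact method): each branch keeps
# only target samples on which the *other* branch is confident, so noisy predictions
# from one view are filtered by the second view before adaptation.
import torch

def select_cross_branch(logits_sem: torch.Tensor,
                        logits_top: torch.Tensor,
                        tau: float = 0.9):
    """Return pseudo-labels and selection masks for the two branches.

    logits_sem, logits_top: [N, C] target-domain logits from the semantic and
    topological branches. tau: confidence threshold (hypothetical value).
    """
    prob_sem = logits_sem.softmax(dim=1)
    prob_top = logits_top.softmax(dim=1)

    conf_sem, pseudo_sem = prob_sem.max(dim=1)   # per-sample confidence and label
    conf_top, pseudo_top = prob_top.max(dim=1)

    # Samples the topological branch is confident about supervise the semantic branch,
    # and vice versa; requiring agreement further suppresses noisy pseudo-labels.
    agree = pseudo_sem == pseudo_top
    mask_for_sem = (conf_top >= tau) & agree     # targets used to adapt the semantic branch
    mask_for_top = (conf_sem >= tau) & agree     # targets used to adapt the topological branch

    return pseudo_top[mask_for_sem], mask_for_sem, pseudo_sem[mask_for_top], mask_for_top


if __name__ == "__main__":
    torch.manual_seed(0)
    logits_a = torch.randn(8, 3)                      # toy logits: 8 target graphs, 3 classes
    logits_b = logits_a + 0.1 * torch.randn(8, 3)     # second view, slightly perturbed
    labels_sem, mask_sem, labels_top, mask_top = select_cross_branch(logits_a, logits_b)
    print(mask_sem.sum().item(), "samples selected for the semantic branch")
```

In this hypothetical setup, the selected pseudo-labels would feed each branch's adaptation loss on the target domain, while a separate noise-aware regularizer would temper the influence of source labels suspected to be corrupted.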