Federated Learning (FL) on graph-structured data typically suffers from the non-IID problem, since each client holds a different subgraph sampled from the global graph. In this paper, we introduce FedAux (Federated Learning with Auxiliary Projections), a personalized subgraph FL framework that aligns, compares, and aggregates heterogeneously distributed local models without sharing raw data or node embeddings. In FedAux, each client jointly learns (i) a local GNN and (ii) a learnable auxiliary projection vector (APV) that differentiably projects node embeddings into a one-dimensional space. Soft alignment operations and lightweight 1D convolutions then refine these embeddings in the aligned space, enabling the APV to effectively capture client-specific information. After local training, the APVs serve as compact signatures that the server uses to compute cross-client similarities and perform similarity-weighted parameter blending, yielding personalized models while preserving knowledge transfer across clients. We further provide a rigorous theoretical analysis establishing the convergence of FedAux and justifying its design. Experimental evaluations on diverse graph benchmarks show that FedAux substantially outperforms existing baselines in both accuracy and personalization.
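The server-side aggregation described above can be illustrated with a minimal sketch. This is not the paper's exact procedure: the choice of cosine similarity between APVs and a row-wise softmax with temperature `tau` are assumptions made for illustration; FedAux may define the similarity and mixing weights differently.

```python
import numpy as np

def personalized_blend(params, apvs, tau=1.0):
    """Similarity-weighted parameter blending (hypothetical sketch).

    params: list of per-client flattened parameter vectors (same shape)
    apvs:   list of per-client auxiliary projection vectors (APVs)
    tau:    softmax temperature -- an assumed hyperparameter, not from the paper
    Returns one personalized parameter vector per client (rows of a K x D matrix).
    """
    # Cosine similarity between clients' APV signatures.
    A = np.stack([a / np.linalg.norm(a) for a in apvs])
    sim = A @ A.T                          # K x K cross-client similarity matrix
    # Row-wise softmax converts similarities into mixing weights that sum to 1.
    w = np.exp(sim / tau)
    w /= w.sum(axis=1, keepdims=True)
    P = np.stack(params)                   # K x D matrix of client parameters
    return w @ P                           # each row: one client's blended model
```

Clients with similar APVs thus receive similar personalized models, while dissimilar clients retain more of their own parameters, which is the intended personalization effect.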