To overcome the limitations of existing aspect-based sentiment analysis (ABSA) methods that rely on dependency syntax trees and contextual semantics, this paper proposes OTESGN, a novel model based on optimal transport. OTESGN integrates syntactic-semantic graph-aware attention with semantic optimal-transport attention to model syntactic dependencies and fine-grained semantic alignments. In particular, the semantic optimal-transport attention captures salient opinion words even amid noisy context, allowing sentiment signals to be identified more accurately. An adaptive attention fusion module and contrastive regularization further improve the model's performance and robustness. Experimental results show that the proposed model achieves F1 improvements of +1.01% and +1.30% over existing state-of-the-art models on the Twitter and Laptop14 benchmarks, respectively.
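To make the idea of optimal-transport attention concrete, the following is a minimal sketch of one common formulation: entropy-regularized transport between aspect and context token embeddings, solved with Sinkhorn iterations, whose transport plan is then row-normalized and used as attention weights. The function names (sinkhorn, ot_attention), the cosine-distance cost, and the uniform marginals are illustrative assumptions, not the paper's exact specification.

```python
import numpy as np

def sinkhorn(cost, reg=0.1, n_iters=50):
    """Entropy-regularized optimal transport plan between uniform marginals,
    computed with Sinkhorn iterations on a token-to-token cost matrix.
    (Illustrative sketch; the paper's exact solver may differ.)"""
    n, m = cost.shape
    K = np.exp(-cost / reg)                  # Gibbs kernel
    a = np.ones(n) / n                       # uniform source marginal (assumed)
    b = np.ones(m) / m                       # uniform target marginal (assumed)
    u, v = np.ones(n) / n, np.ones(m) / m
    for _ in range(n_iters):
        u = a / (K @ v)
        v = b / (K.T @ u)
    return np.diag(u) @ K @ np.diag(v)       # transport plan T

def ot_attention(aspect_vecs, context_vecs, reg=0.1):
    """Align aspect token embeddings with context token embeddings via an OT
    plan over a cosine-distance cost, then use the row-normalized plan as
    attention weights to pool context vectors for each aspect token."""
    A = aspect_vecs / np.linalg.norm(aspect_vecs, axis=1, keepdims=True)
    C = context_vecs / np.linalg.norm(context_vecs, axis=1, keepdims=True)
    cost = 1.0 - A @ C.T                     # cosine distance as transport cost
    plan = sinkhorn(cost, reg=reg)
    weights = plan / plan.sum(axis=1, keepdims=True)
    return weights @ context_vecs            # attended context per aspect token

# Toy usage: 2 aspect tokens, 5 context tokens, 8-dimensional embeddings.
rng = np.random.default_rng(0)
pooled = ot_attention(rng.normal(size=(2, 8)), rng.normal(size=(5, 8)))
print(pooled.shape)  # (2, 8)
```

Compared with standard softmax attention, the transport plan is constrained by both row and column marginals, which spreads attention mass more evenly and can suppress spurious peaks on noisy words; this is the intuition behind using OT-based alignment for locating opinion words.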