Daily Arxiv

This is a page that curates AI-related papers published worldwide.
All content here is summarized using Google Gemini, and the page is operated on a non-profit basis.
Copyright for each paper belongs to the authors and their institutions; please make sure to credit the source when sharing.

OTESGN: Optimal Transport Enhanced Syntactic-Semantic Graph Networks for Aspect-Based Sentiment Analysis

Created by
  • Haebom

Authors

Xinfeng Liao, Xuanqi Chen, Lianxi Wang, Jiahuan Yang, Zhuowei Chen, Ziying Rong

Outline

To overcome the limitations of existing aspect-based sentiment analysis (ABSA) methods that rely on dependency syntax trees and contextual semantics, this paper proposes OTESGN, a novel model based on optimal transport. OTESGN integrates syntactic-semantic graph-aware attention with semantic optimal-transport attention to effectively model syntactic dependencies and subtle semantic alignments. In particular, the semantic optimal-transport attention locates important opinion words even in the presence of noisy context, allowing sentiment signals to be identified precisely. An adaptive attention fusion module and contrastive regularization further improve the model's performance and robustness. Experimental results show that the proposed model improves F1 by +1.01% and +1.30% over existing state-of-the-art models on the Twitter and Laptop14 benchmarks, respectively.
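
For intuition, the sketch below shows how an optimal-transport attention of the kind described above can be computed with entropic (Sinkhorn) regularization: aspect-token representations are aligned to context-token representations via a transport plan whose rows serve as attention weights. This is a minimal illustration, not the paper's implementation; the function name, cost choice (cosine distance), uniform marginals, and hyperparameters are assumptions for the example.

```python
import torch

def sinkhorn_attention(aspect_h, context_h, eps=0.1, n_iters=20):
    """Hypothetical sketch of optimal-transport attention (not the paper's exact method).

    aspect_h:  (m, d) hidden states of aspect tokens
    context_h: (n, d) hidden states of context tokens
    Returns an (m, n) transport plan whose rows act as attention weights.
    """
    # Cost matrix: 1 - cosine similarity between aspect and context tokens.
    a = torch.nn.functional.normalize(aspect_h, dim=-1)
    c = torch.nn.functional.normalize(context_h, dim=-1)
    cost = 1.0 - a @ c.T                                   # (m, n)

    # Entropic regularization kernel.
    K = torch.exp(-cost / eps)                             # (m, n)

    # Uniform marginals over aspect and context tokens (an assumption here).
    mu = torch.full((aspect_h.size(0),), 1.0 / aspect_h.size(0))
    nu = torch.full((context_h.size(0),), 1.0 / context_h.size(0))

    # Sinkhorn iterations project K onto the prescribed marginals.
    u = torch.ones_like(mu)
    for _ in range(n_iters):
        u = mu / (K @ (nu / (K.T @ u)))
    v = nu / (K.T @ u)

    # Transport plan; renormalize rows so each aspect token's weights sum to 1.
    plan = torch.diag(u) @ K @ torch.diag(v)
    return plan / plan.sum(dim=-1, keepdim=True)


# Example: 2 aspect tokens attending over 6 context tokens.
attn = sinkhorn_attention(torch.randn(2, 128), torch.randn(6, 128))
print(attn.shape, attn.sum(dim=-1))   # torch.Size([2, 6]); each row sums to 1
```

Compared with softmax attention, the transport plan satisfies marginal constraints over both token sets, which is what lets this family of methods spread attention mass onto relevant opinion words rather than letting a few noisy tokens dominate.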

Takeaways, Limitations

Takeaways:
Presents OTESGN, a new aspect-based sentiment analysis model based on optimal transport.
Effectively models syntactic dependencies and subtle semantic alignments.
Robust to noise and able to accurately locate opinion words.
Achieves state-of-the-art performance on the Twitter and Laptop14 benchmarks.
Limitations:
The paper does not explicitly discuss its limitations.
Since the evaluation focuses on specific domains (Twitter, Laptop14), generalizability to other domains requires further research.
The computational complexity of the optimal transport-based attention mechanism is not analyzed.