
Daily Arxiv

This is a page that curates AI-related papers published worldwide.
All content is summarized using Google Gemini, and the site is operated on a non-profit basis.
Copyright for each paper belongs to the authors and their institutions; please make sure to credit the source when sharing.

Oversmoothing Alleviation in Graph Neural Networks: A Survey and Unified View

Created by
  • Haebom

Author

Yufei Jin, Xingquan Zhu

Outline

This paper comprehensively analyzes and classifies existing methods for addressing the oversmoothing problem in graph neural networks (GNNs). Oversmoothing refers to the phenomenon in which node embeddings become increasingly similar as GNN layers deepen, making nodes hard to distinguish and obscuring network proximity. The paper presents existing oversmoothing mitigation techniques from a unified perspective called ATNPA, which consists of five core steps: Augmentation, Transformation, Normalization, Propagation, and Aggregation. It further proposes a taxonomy organized around three topics for addressing oversmoothing, classifies existing methods into six categories, and analyzes each category's relationship to ATNPA along with its advantages and disadvantages, thereby providing an in-depth understanding of existing research and suggesting future research directions.
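To make the ATNPA decomposition concrete, below is a minimal NumPy sketch of a single GNN layer written as the five steps. The specific operations chosen here (DropEdge-style edge dropout for Augmentation, a linear transform, PairNorm-style centering for Normalization, row-normalized propagation, and a residual combination for Aggregation) are illustrative assumptions for one possible instantiation, not the paper's prescribed formulation; the function name atnpa_layer and its parameters are hypothetical.

```python
# Illustrative sketch only: one GNN layer decomposed into the five ATNPA steps.
# The step names follow the survey's unified view; the concrete operations are
# assumed here for illustration and are not taken from the paper itself.
import numpy as np

rng = np.random.default_rng(0)

def atnpa_layer(X, A, W, drop_p=0.1, alpha=0.5):
    """One message-passing layer expressed as Augmentation -> Transformation
    -> Normalization -> Propagation -> Aggregation (all choices illustrative)."""
    # Augmentation: randomly drop edges (a DropEdge-style graph augmentation).
    mask = rng.random(A.shape) > drop_p
    A_aug = A * mask

    # Transformation: node-wise linear feature transform.
    H = X @ W

    # Normalization: PairNorm-style centering and rescaling of embeddings,
    # which counteracts the shrinking pairwise distances behind oversmoothing.
    H = H - H.mean(axis=0, keepdims=True)
    H = H / (np.linalg.norm(H, axis=1, keepdims=True) + 1e-8)

    # Propagation: smooth features over the row-normalized augmented graph.
    deg = A_aug.sum(axis=1, keepdims=True) + 1e-8
    P = A_aug / deg
    H_prop = P @ H

    # Aggregation: combine propagated features with a residual of the
    # transformed input, so deep stacks retain some per-node signal.
    return alpha * H_prop + (1 - alpha) * (X @ W)

# Toy usage: 4 nodes, 3 features, a small ring graph.
X = rng.normal(size=(4, 3))
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
W = rng.normal(size=(3, 3))
print(atnpa_layer(X, A, W).shape)  # (4, 3)
```

The point of the decomposition is that most mitigation techniques surveyed in the paper can be read as modifying one or more of these five slots, e.g. swapping the normalization or changing how propagated features are aggregated with residuals.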

Takeaways, Limitations

Takeaways:
The survey systematically analyzes and classifies GNN oversmoothing mitigation techniques under the unified ATNPA view, improving the understanding of existing studies.
A new taxonomy of oversmoothing alleviation methods is proposed, along with suggested future research directions.
By comparing and analyzing the pros and cons of the various techniques, the applicability and limitations of each are clarified.
Limitations:
Further validation is needed to determine whether the ATNPA framework fully encompasses all existing methods.
The proposed taxonomy may not perfectly cover every existing study and may require revision as new techniques emerge.
The survey lacks experimental analysis comparing the performance of the surveyed techniques.