
Daily Arxiv

This is a page that curates AI-related papers published worldwide.
All content here is summarized using Google Gemini and operated on a non-profit basis.
Copyright for each paper belongs to the authors and their institutions; please make sure to credit the source when sharing.

Can we ease the Injectivity Bottleneck on Lorentzian Manifolds for Graph Neural Networks?

Created by
  • Haebom

Author

Srinitish Srinivasan, Omkumar C.U.

Outline

The Lorentzian Graph Isomorphic Network (LGIN) is a novel hyperbolic graph neural network (HGNN) proposed to address the limited discriminative power of hyperbolic GNNs, which are otherwise promising for hierarchical data. To overcome the weak representational power caused by the non-injective aggregation of conventional hyperbolic GNNs, the authors introduce a novel update rule that captures rich structural information while preserving the Lorentzian metric. Extensive evaluations on nine benchmark datasets demonstrate its ability to capture complex graph structures, outperforming or matching state-of-the-art hyperbolic and Euclidean baselines. LGIN is presented as the first model to adapt the principles of powerful, highly discriminative GNN architectures to a Riemannian manifold.
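To make the idea concrete, the update described above can be sketched as a GIN-style sum aggregation carried out in the tangent space of the Lorentz (hyperboloid) model, then mapped back onto the manifold so the Lorentzian constraint is preserved. This is a minimal illustration, not the authors' actual implementation: the function names, the choice of aggregating at the origin's tangent space, and the `eps` weighting are all assumptions on my part.

```python
import numpy as np

def exp_map_origin(v_bar):
    """Map a tangent vector at the hyperboloid origin onto the Lorentz model.

    For v_bar in R^d, returns (cosh|v|, sinh|v| * v/|v|), which satisfies
    the Lorentz constraint -x0^2 + ||x_bar||^2 = -1.
    """
    n = np.linalg.norm(v_bar)
    if n < 1e-9:
        return np.concatenate(([1.0], np.zeros_like(v_bar)))
    return np.concatenate(([np.cosh(n)], np.sinh(n) * v_bar / n))

def log_map_origin(x):
    """Inverse of exp_map_origin: hyperboloid point -> tangent vector at origin."""
    x_bar = x[1:]
    n = np.linalg.norm(x_bar)
    if n < 1e-9:
        return np.zeros_like(x_bar)
    return np.arccosh(np.clip(x[0], 1.0, None)) * x_bar / n

def lgin_like_update(x_self, x_neighbors, eps=0.1):
    """Hypothetical GIN-style update on the Lorentz manifold.

    Injectivity-leaning sum aggregation (the (1 + eps) self-weighting is the
    GIN trick) is done in the flat tangent space, then projected back, so the
    output is again a valid hyperboloid point.
    """
    t_self = log_map_origin(x_self)
    t_agg = (1.0 + eps) * t_self + sum(log_map_origin(x) for x in x_neighbors)
    return exp_map_origin(t_agg)
```

In a full model a learnable transformation (e.g. an MLP) would act on `t_agg` before the exponential map; the sketch keeps only the geometric skeleton: log-map, sum-aggregate, exp-map.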

Takeaways, Limitations

Takeaways:
We suggest the possibility of developing more expressive GNNs on the Riemannian manifold.
We present a novel update rule that overcomes the discriminative limitations of existing hyperbolic GNNs.
We validate the effectiveness of our model by achieving state-of-the-art or comparable performance on diverse benchmark datasets.
We present a novel method to apply powerful GNN architectures to the Riemannian manifold.
Limitations:
The paper does not explicitly state any Limitations; further analysis and experiments would be needed to identify them.