This is a page that curates AI-related papers published worldwide. All content here is summarized using Google Gemini and operated on a non-profit basis. Copyright for each paper belongs to the authors and their institutions; please make sure to credit the source when sharing.
This paper proposes the Equivariant Spherical Transformer (EST) to overcome the limitations of SE(3)-equivariant graph neural networks (GNNs) for modeling molecular systems. Conventional message passing built on tensor product-based convolutions has limited expressive power, owing to insufficient nonlinearity and incomplete group representations. EST addresses these issues by applying a Transformer in the spatial domain of the group representation, obtained via a Fourier transform. The authors show, both theoretically and experimentally, that EST's function space encompasses that of the tensor product while achieving superior expressive power, and that a uniform sampling strategy for the Fourier transform guarantees a homogeneous transform-induced bias. Experiments demonstrate state-of-the-art performance on molecular benchmarks such as OC20 and QM9.
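The core idea — map spherical-harmonic coefficients to a signal sampled uniformly on the sphere, apply nonlinear processing there, then transform back — can be sketched in a few lines. This is an illustrative toy, not the paper's implementation: it uses an unnormalized l≤1 basis, a Fibonacci lattice as a stand-in for the paper's uniform sampling, and a pointwise `tanh` in place of the actual Transformer layers.

```python
import numpy as np

def sph_basis(points):
    """Real spherical-harmonic basis up to l=1 at unit vectors (unnormalized).
    Columns: Y_00 (constant), then the l=1 components ~ y, z, x."""
    x, y, z = points.T
    return np.stack([np.ones_like(x), y, z, x], axis=1)  # shape (N, 4)

def fibonacci_sphere(n):
    """Near-uniform sample points on the unit sphere (Fibonacci lattice),
    standing in for the paper's uniform sampling strategy."""
    i = np.arange(n)
    phi = np.pi * (3.0 - np.sqrt(5.0)) * i
    z = 1.0 - 2.0 * (i + 0.5) / n
    r = np.sqrt(1.0 - z * z)
    return np.stack([r * np.cos(phi), r * np.sin(phi), z], axis=1)

def est_style_block(coeffs, n_samples=64):
    """Toy EST-style step: synthesize the spatial-domain signal on sphere
    samples, apply a nonlinearity there (where EST would run attention),
    and recover coefficients via least squares."""
    pts = fibonacci_sphere(n_samples)
    B = sph_basis(pts)                  # (N, 4) basis matrix
    signal = B @ coeffs                 # spatial-domain signal on the sphere
    signal = np.tanh(signal)            # nonlinearity applied pointwise
    new_coeffs, *_ = np.linalg.lstsq(B, signal, rcond=None)
    return new_coeffs

coeffs = np.array([0.5, 1.0, -0.3, 0.2])
out = est_style_block(coeffs)
print(out.shape)
```

Because the nonlinearity acts pointwise on the sampled signal rather than on the coefficients, rotating the input coefficients and rotating the output commute in the limit of dense uniform sampling — the property the paper's sampling strategy is designed to preserve.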
Takeaways, Limitations
• Takeaways:
◦ EST, a new framework that overcomes the expressive-power limitations of SE(3)-equivariant GNNs.
◦ Superior expressive power over tensor product-based convolution, verified experimentally.
◦ Homogeneity guaranteed through a uniform sampling strategy for the Fourier transform.
◦ State-of-the-art performance across a variety of molecular benchmarks, including OC20 and QM9.
• Limitations:
◦ No analysis or comparison of EST's computational complexity.
◦ Generalization to other types of molecular systems and larger datasets requires further study.
◦ No in-depth analysis of how the choice of sampling strategy affects performance.