This page curates AI-related papers published worldwide. All content is summarized using Google Gemini, and the site is operated on a non-profit basis. Copyright for each paper belongs to the authors and their institutions; please credit the source when sharing.
Andrew Liu, Axel Elaldi, Nicholas T Franklin, Nathan Russell, Gurinder S Atwal, Yih-En A Ban, Olivia Viessmann
Outline
Invariant Point Attention (IPA) is an important algorithm for geometry-aware modeling in structural biology and is central to many protein and RNA structure models. However, its quadratic complexity limits the input sequence lengths it can handle. In this paper, we introduce FlashIPA, a factorized reformulation of IPA that leverages hardware-efficient FlashAttention to achieve linear scaling of GPU memory and wall-clock time with sequence length. FlashIPA matches or exceeds standard IPA performance while substantially reducing computational cost. It extends training to previously unattainable lengths, which we demonstrate by retraining generative models without length constraints and generating structures of thousands of residues. FlashIPA is available at https://github.com/flagshippioneering/flash_ipa .
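The summary does not spell out how the factorization enables FlashAttention, so here is a minimal sketch of the general idea under stated assumptions: standard IPA-style attention adds an explicit L x L pairwise bias (quadratic in memory), whereas a low-rank factorization of that bias can be folded into the query/key channels so a fused, memory-efficient kernel never materializes an L x L tensor. All names, shapes, and the rank-R factorization below are illustrative assumptions, not the paper's actual API or implementation.

```python
# Minimal sketch (not the authors' implementation) contrasting a naive
# O(L^2)-memory geometric bias with a factorized form usable by a fused
# attention kernel. Shapes and names are illustrative assumptions.
import torch
import torch.nn.functional as F

L, H, D, R = 512, 8, 32, 4          # residues, heads, head dim, factor rank

q = torch.randn(1, H, L, D)         # per-residue queries
k = torch.randn(1, H, L, D)         # per-residue keys
v = torch.randn(1, H, L, D)         # per-residue values

# Naive path: an explicit L x L pair bias forces quadratic memory.
pair_bias = torch.randn(1, H, L, L)                      # O(L^2) tensor
out_naive = F.scaled_dot_product_attention(q, k, v, attn_mask=pair_bias)

# Factorized path: approximate the bias as b_ij ~ <u_i, w_j> with low-rank
# per-residue factors, and fold them into Q and K so the kernel computes
# q_i.k_j + u_i.w_j (up to its 1/sqrt(d) scaling) without an L x L tensor.
u = torch.randn(1, H, L, R)         # per-query factor
w = torch.randn(1, H, L, R)         # per-key factor
q_aug = torch.cat([q, u], dim=-1)   # [1, H, L, D+R]
k_aug = torch.cat([k, w], dim=-1)
out_factored = F.scaled_dot_product_attention(q_aug, k_aug, v)

print(out_naive.shape, out_factored.shape)  # both (1, 8, 512, 32)
```

With the bias folded into the query/key channels, PyTorch's `scaled_dot_product_attention` can dispatch to a FlashAttention-style backend, so peak memory grows linearly rather than quadratically with L; how FlashIPA actually constructs its factors is detailed in the paper and repository linked above.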