/tools/flash_ipa
flagshippioneering/flash_ipa
FlashIPA is an efficient implementation of the Invariant Point Attention (IPA) module, aimed at reducing training and inference time for models used in molecular simulations. It also provides a framework for integrating edge embeddings, which can be crucial for modeling molecular interactions.
[NeurIPS 2025 spotlight] Efficient factorized variant of the IPA module.
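The core idea of a factorized edge bias can be illustrated with a minimal NumPy sketch: instead of materializing a full N x N pairwise bias, two low-rank factors reconstruct it, cutting memory from O(N^2) to O(N * r). The shapes, rank `r`, and variable names below are illustrative assumptions, not FlashIPA's actual API.

```python
import numpy as np

# Illustrative sizes (not FlashIPA defaults): sequence length, rank, head dim.
N, r, d = 64, 4, 16
rng = np.random.default_rng(0)

q = rng.standard_normal((N, d))
k = rng.standard_normal((N, d))

# Factorized edge bias: two rank-r factors, O(N * r) memory each,
# in place of a dense N x N pairwise bias.
a = rng.standard_normal((N, r))
b = rng.standard_normal((N, r))
z_fact = a @ b.T  # reconstructs the N x N bias on the fly

# Standard scaled dot-product attention with the factorized bias added.
logits = q @ k.T / np.sqrt(d) + z_fact
attn = np.exp(logits - logits.max(axis=-1, keepdims=True))
attn /= attn.sum(axis=-1, keepdims=True)

print(attn.shape)                            # (64, 64)
print(np.allclose(attn.sum(axis=-1), 1.0))   # True
```

In a fused-kernel setting, the factors `a` and `b` can be consumed tile by tile so the full N x N bias is never stored, which is what makes the factorization compatible with memory-efficient attention.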