lucidrains/invariant-point-attention
171 stars · 12 forks · Python · Added February 8, 2026
Summary
Invariant Point Attention is a standalone PyTorch module for coordinate refinement of protein structures, originally used in the structure module of AlphaFold2. It processes molecular representations in a way that is invariant to global rotations and translations, improving the accuracy of protein structure prediction.
Description
Implementation of Invariant Point Attention, used for coordinate refinement in the structure module of AlphaFold2, as a standalone PyTorch module
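The defining property of Invariant Point Attention is that its output does not change when the input coordinate frame undergoes a global rigid motion (rotation plus translation). A minimal pure-Python sketch of that invariance idea, using pairwise distances as a rigid-motion-invariant quantity (this is an illustrative toy, not the library's code):

```python
import math

def rotate_z(p, theta):
    # Rotate a 3D point about the z-axis by angle theta.
    x, y, z = p
    c, s = math.cos(theta), math.sin(theta)
    return (c * x - s * y, s * x + c * y, z)

def rigid_transform(points, theta, t):
    # Apply a global rotation followed by a translation t to all points.
    return [tuple(a + b for a, b in zip(rotate_z(p, theta), t)) for p in points]

def pairwise_distances(points):
    # Pairwise Euclidean distances; invariant under any rigid motion.
    return [
        math.dist(points[i], points[j])
        for i in range(len(points))
        for j in range(i + 1, len(points))
    ]

points = [(0.0, 0.0, 0.0), (1.0, 2.0, 0.5), (-1.0, 0.3, 2.0)]
moved = rigid_transform(points, theta=0.7, t=(3.0, -1.0, 2.0))

# Distances are unchanged by the global rotation + translation.
for d0, d1 in zip(pairwise_distances(points), pairwise_distances(moved)):
    assert abs(d0 - d1) < 1e-9
```

In the actual module, attention scores are built from such frame-invariant geometric quantities, which is what makes iterative coordinate refinement stable under arbitrary orientations of the input structure.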
Topics
artificial-intelligence · deep-learning · protein-folding
Ratings: N/A (0 ratings)