robotcator

24 comments by robotcator

@zoq Great, I followed your solution and it works now. Thanks to @KimSangYeon-DGU for the helpful suggestion too. But can we keep up with the changes in gym, for the newer...

Hi, if you want to use the optimized flash attention code, you can check out the [code here](https://github.com/robotcator/Uni-Fold/tree/flash-attn-bias), and this [document](https://github.com/robotcator/Uni-Fold/blob/flash-attn-bias/unifold/modules/flash_attn_readme.md) may be helpful. Hope this helps.

Can you provide some details about your flash attention installation? It seems that the backward pass did not work correctly.

Can you write a single test for the `flash_attn` interface with an input shape like `[1, 292, 292, 128]`, so that we can check whether the function works...
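
A minimal sketch of the kind of test being asked for, checking both the forward output and the gradients against a plain PyTorch reference. The entry point `flash_attn_with_bias`, its module path, the bias layout, and the way the `[1, 292, 292, 128]` activation is split into heads are all assumptions; adapt them to the actual interface in the flash-attn-bias branch.

```python
import torch

# Hypothetical import: the real function name and module path may differ.
from unifold.modules import flash_attn_with_bias


def reference_attention(q, k, v, bias):
    # Plain PyTorch attention in fp32, used as ground truth.
    # q, k, v: [batch, seqlen, heads, head_dim]; bias: [batch, heads, seqlen, seqlen].
    scale = q.shape[-1] ** -0.5
    scores = torch.einsum("bqhd,bkhd->bhqk", q.float(), k.float()) * scale
    probs = (scores + bias.float()).softmax(dim=-1)
    return torch.einsum("bhqk,bkhd->bqhd", probs, v.float())


def test_flash_attn_bias():
    torch.manual_seed(0)
    # Interpret [1, 292, 292, 128] as 292 sequences of length 292 with
    # 8 heads of dim 16 -- the head split is an assumption.
    b, s, h, d = 292, 292, 8, 16
    q, k, v = (torch.randn(b, s, h, d, device="cuda", dtype=torch.float16,
                           requires_grad=True) for _ in range(3))
    bias = torch.randn(b, h, s, s, device="cuda", dtype=torch.float16)

    out_ref = reference_attention(q, k, v, bias)
    out = flash_attn_with_bias(q, k, v, bias)
    assert torch.allclose(out.float(), out_ref, atol=2e-2, rtol=2e-2)

    # The backward pass is what reportedly misbehaves, so compare gradients too.
    grad = torch.randn_like(out_ref)
    dq_ref, dk_ref, dv_ref = torch.autograd.grad(out_ref, (q, k, v), grad)
    dq, dk, dv = torch.autograd.grad(out, (q, k, v), grad.half())
    for got, want in zip((dq, dk, dv), (dq_ref, dk_ref, dv_ref)):
        assert torch.allclose(got.float(), want.float(), atol=2e-2, rtol=2e-2)
```

The fp32 reference and the loose fp16 tolerances are deliberate: they make it easy to tell a genuine backward bug apart from ordinary half-precision rounding error.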