Gustaf Ahdritz
Yeah that would be great actually.
Send it to my Gmail, which is just my GitHub username.
I *think* I resolved this in 6c89015. @lzhangUT, @bing-song, @epenning, could you verify that the inference script works on your systems now?
Could you try without Docker?
Excellent. I've since pushed a fix that should work for Docker. Could you give it a try? If that still doesn't work, could you change `compute_capability, _` to `compute_capability, error`...
You did the edit slightly wrong: you should replace `compute_capability, _` with `compute_capability, error` and then print `error`, rather than replacing it with just `compute_capability`.
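For anyone following along, here's a rough sketch of the diagnostic change I mean. The helper below is just a stand-in for whatever the setup script actually calls; the point is only to unpack and print the error instead of discarding it:

```python
import torch

def get_compute_capability():
    # Stand-in for the helper the setup script actually uses; the real name
    # in the repo may differ. Returns a (capability, error) tuple.
    try:
        major, minor = torch.cuda.get_device_capability()
        return f"{major}.{minor}", None
    except Exception as e:  # e.g. no GPU visible inside the container
        return None, e

# The change described above: capture the error instead of discarding it with `_`.
compute_capability, error = get_compute_capability()
print(compute_capability)
print(error)
```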
Yes exactly.
Thanks! Do those autocast fixes work during DeepSpeed training, where an APEX-based autocast framework is used instead of the native torch one? The reason those operations are spelled out like...
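To make the concern concrete: a guard like the one below only affects the native torch autocast context, whereas APEX amp casts by patching functions globally, so the same region could behave differently under the DeepSpeed setup. This is just an illustrative sketch, not the code from the fix:

```python
import torch

def precision_sensitive_op(x):
    # Sketch: opting out of mixed precision via the *native* torch autocast
    # context. APEX-based amp does its casting by patching ops globally, so a
    # guard like this would not apply there.
    with torch.cuda.amp.autocast(enabled=False):
        return torch.softmax(x.float(), dim=-1)
```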
For now, you should definitely use the hacky AlphaFold-Gap implementation on the main branch. The multimer branch is still experimental (I've had zero time to work on that recently), and...
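For reference, the AlphaFold-Gap trick amounts to running the monomer model on the concatenated chains with a large jump in `residue_index` between them. A rough sketch of the idea; the offset value and feature handling here are illustrative, not the exact main-branch implementation:

```python
import numpy as np

def join_chains_with_gap(chain_seqs, gap=200):
    # Concatenate the chain sequences and offset residue_index between chains
    # so the relative positional encoding treats them as separate chains.
    joined_seq = "".join(chain_seqs)
    residue_index = []
    offset = 0
    for seq in chain_seqs:
        residue_index.extend(range(offset, offset + len(seq)))
        offset += len(seq) + gap
    return joined_seq, np.array(residue_index, dtype=np.int64)
```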
Those are just symbolic names. You need to change `fasta_dir` etc. to the names of actual directories containing the corresponding files. `fasta_dir` should be a directory containing .fasta files whose...
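Something like the following is what the script expects; the paths and the second directory name here are just examples, so substitute your own locations:

```python
from pathlib import Path

# Example only: point these at real directories on your machine.
fasta_dir = Path("/data/targets")                          # holds the .fasta files you want to run
template_mmcif_dir = Path("/data/pdb_mmcif/mmcif_files")   # per-entry .cif files (name is illustrative)

assert fasta_dir.is_dir() and any(fasta_dir.glob("*.fasta")), "no .fasta files found in fasta_dir"
assert template_mmcif_dir.is_dir(), "mmCIF directory not found"
```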