
DeeplyTough: Learning Structural Comparison of Protein Binding Sites

10 DeeplyTough issues

Hi, I ran into a tricky problem: when I run the datasets_downloader.sh command, it does not download the full dataset. I hope someone can help me with this.
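
This fix is not part of the original report: a minimal retry sketch in Python, assuming the incomplete downloads come from transient network failures; the URL and filename below are placeholders, not the script's real endpoints.

```python
import time
import urllib.request

def download_with_retry(url, dest, attempts=5, delay=10):
    """Retry a flaky download a few times before giving up."""
    for attempt in range(1, attempts + 1):
        try:
            urllib.request.urlretrieve(url, dest)
            return
        except OSError as exc:  # URLError subclasses OSError
            print(f"attempt {attempt}/{attempts} failed: {exc}")
            time.sleep(delay)
    raise RuntimeError(f"could not download {url} after {attempts} attempts")

# Placeholder usage; substitute the archive that fails for you:
# download_with_retry("https://example.org/TOUGH-M1.tar.gz", "TOUGH-M1.tar.gz")
```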

Bumps [numpy](https://github.com/numpy/numpy) from 1.19.2 to 1.22.0. Release notes, sourced from numpy's releases: NumPy 1.22.0 is a big release featuring the work of 153 contributors spread... (label: dependencies)

Hi, I just started out with your tool, ran into problems similar to those already reported, and wanted to share the solutions that worked for me. I followed the...

Hello Josh, I am considering using DeeplyTough as an embedder for protein pockets, so that each pocket is mapped to a vector of descriptors. Could you...
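
For context, a hypothetical sketch of what such an embedder could look like; this is not DeeplyTough's actual API, just an assumed PyTorch model that maps a voxelized pocket to a descriptor vector, with similarity read off as descriptor distance.

```python
import torch

def embed_pocket(model: torch.nn.Module, pocket_voxels: torch.Tensor) -> torch.Tensor:
    """Map one voxelized pocket of shape (C, D, H, W) to a 1-D descriptor."""
    model.eval()
    with torch.no_grad():
        descriptor = model(pocket_voxels.unsqueeze(0))  # add a batch dimension
    return descriptor.squeeze(0)

# Two pockets are judged similar when their descriptors are close, e.g. in L2:
# score = torch.dist(embed_pocket(net, pocket_a), embed_pocket(net, pocket_b))
```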

Hi. I executed the command to evaluate on the Vertex dataset or the ProSPECCTS dataset, but in both cases I get almost the same error as the one below. (I exported $STRUCTURE_DATA_DIR = $DEEPLYTOUGH/datasets_structure....
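
Not part of the original report, but two quick checks may help: in bash the variable on the left of `=` takes no `$` and no surrounding spaces (i.e. `export STRUCTURE_DATA_DIR=$DEEPLYTOUGH/datasets_structure`), and the sketch below (a minimal diagnostic, assuming the variable names from the report) confirms both variables are set and point at real directories.

```python
import os

# Confirm the expected environment variables are set and point at
# existing directories before launching the evaluation scripts.
for var in ("DEEPLYTOUGH", "STRUCTURE_DATA_DIR"):
    value = os.environ.get(var)
    if value is None:
        print(f"{var} is not set")
    else:
        print(f"{var}={value} (directory exists: {os.path.isdir(value)})")
```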

My solution is to use mdtraj==1.9.9 instead. Someone said that changing the Cython version also helps, but that did not work for me.
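
A minimal check, assuming mdtraj was pinned as above, to confirm the interpreter actually resolves the pinned version rather than one from another environment:

```python
import mdtraj

print(mdtraj.__version__)  # expect "1.9.9"
print(mdtraj.__file__)     # path shows which environment supplied the package
```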

Hi, glad to see your excellent work! I followed your code to preprocess TOUGH-M1 and Vertex for training and evaluation with `--db_preprocessing` set to `0`, just trying to attain the...

Hello, when I try to run custom_evaluation.py on the example custom pairs, I get the following error: Traceback (most recent call last): File "deeplytough/scripts/custom_evaluation.py", line 69, in main() File "deeplytough/scripts/custom_evaluation.py",...

Hi, I encountered an error when running your code: Traceback (most recent call last): File "/data/zhangjunyi/DeeplyTough/deeplytough/scripts/toughm1_benchmark.py", line 6, in from datasets import ToughM1 File "/data/zhangjunyi/DeeplyTough/deeplytough/datasets/__init__.py", line 1, in from .toughm1...
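
This is an assumption, since the traceback is truncated: a pip-installed package named `datasets` (for example Hugging Face's) can shadow the repository's local `deeplytough/datasets` package, which would break `from datasets import ToughM1`. A quick way to check which module Python resolves:

```python
# Run from the repository's deeplytough/ directory; if the printed path
# points into site-packages rather than the DeeplyTough source tree,
# the local package is being shadowed by an installed one.
import datasets

print(datasets.__file__)
```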

Dear Authors, Congratulations on an innovative publication! I was wondering if you would be willing to make the pretrained models available? Thanks, Raghav