DiffDock
GPU memory is used when running inference.py even after changing the device to 'cpu'
Hi, I observed that while running the command [time python -m inference --config default_inference_args.yaml --protein_ligand_csv data/test_1.csv --out_dir results_L_cpu/user_predictions_small], GPU memory is still used even though the device is changed to 'cpu' in inference.py. Can anyone help me figure out how to run the above script entirely on CPU?
The simplest fix would be to hide the GPU via an environment variable:
export CUDA_VISIBLE_DEVICES=-1
time python -m inference --config default_inference_args.yaml --protein_ligand_csv data/test_1.csv --out_dir results_L_cpu/user_predictions_small
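To confirm the GPU is actually hidden, a quick sanity check (a minimal sketch, assuming PyTorch is installed as DiffDock requires) is to set the variable before PyTorch initializes CUDA and verify that no device is visible:

```python
import os

# Hide all CUDA devices before importing torch, so PyTorch never initializes the GPU.
os.environ["CUDA_VISIBLE_DEVICES"] = "-1"

import torch

# Should print False: with no visible devices, everything falls back to CPU.
print(torch.cuda.is_available())
```

Note that the variable must be set before the process (or torch import) starts, which is why exporting it in the shell before launching inference works.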