RFdiffusion
RuntimeError: mat1 and mat2 shapes cannot be multiplied (111556x92 and 103x64)
Hi, I want to design a binder for my target. My command:
python /app/RFdiffusion/scripts/run_inference.py \
inference.input_pdb=/app/data/inputs/e.pdb \
inference.output_prefix=/app/data/outputs/E_complex_model \
inference.num_designs=1 \
'contigmap.contigs=[E24-357/0,50-80]' \
'ppi.hotspot_res=[E250,E251,E252,E253,E254,E255,E256,E257,E258,E259]' \
diffuser.T=15 \
inference.ckpt_override_path=/app/RFdiffusion/models/Complex_Fold_base_ckpt.pt \
denoiser.noise_scale_ca=0 \
denoiser.noise_scale_frame=0
but it fails with: RuntimeError: mat1 and mat2 shapes cannot be multiplied (111556x92 and 103x64)
I'm not sure, but it may be the PDB or the contig. For the contig, you could try replacing "," with " ". As for the PDB, in my own runs I stripped the HETATM records and waters; maybe that was the issue for you.
Can you provide input and output files so that we can try to recreate your issue? I'm not sure what's causing it, but the more information you give, the better we can diagnose it.
A few notes, please ignore if you chose these values for specific reasons:
- Setting the CA and frame noise scales to 0 is not recommended; it will reduce the diversity of your designs.
- 15 timesteps is a bit low; the smallest number I have seen recommended is 20.
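Applied to the command from the original post (paths copied verbatim), those two notes would look roughly like this; if I remember the defaults correctly, diffuser.T defaults to 50 and both noise scales default to 1, so dropping the denoiser overrides restores full noise, and any T of 20 or above should satisfy the second note:

```
python /app/RFdiffusion/scripts/run_inference.py \
inference.input_pdb=/app/data/inputs/e.pdb \
inference.output_prefix=/app/data/outputs/E_complex_model \
inference.num_designs=1 \
'contigmap.contigs=[E24-357/0,50-80]' \
'ppi.hotspot_res=[E250,E251,E252,E253,E254,E255,E256,E257,E258,E259]' \
diffuser.T=50 \
inference.ckpt_override_path=/app/RFdiffusion/models/Complex_Fold_base_ckpt.pt
```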
My input (target protein) is the three chains of a CRISPR complex. I get this error whenever I use inference.ckpt_override_path=/app/RFdiffusion/models/Complex_Fold_base_ckpt.pt, and I can't set up my run without this checkpoint.
So it will run with other checkpoint files?
I'm getting a similar error when I try to run with the InpaintSeq_Fold_ckpt checkpoint file; if I provide the same inputs and the InpaintSeq_ckpt checkpoint, the program runs fine.
I was able to construct a simple minimal example starting from 3IOL. This command runs fine:
run_inference.py \
inference.input_pdb="3IOL.pdb" \
'contigmap.contigs=[B10-35/0 70-100]' \
'ppi.hotspot_res=[B28,B29]' \
'inference.num_designs=1' \
'contigmap.inpaint_str=[B10-35]' \
inference.schedule_directory_path='./output' \
inference.ckpt_override_path='/app/RFdiffusion/models/InpaintSeq_ckpt.pt'
but this command exits with an error:
run_inference.py \
inference.input_pdb="3IOL.pdb" \
'contigmap.contigs=[B10-35/0 70-100]' \
'ppi.hotspot_res=[B28,B29]' \
'inference.num_designs=1' \
'contigmap.inpaint_str=[B10-35]' \
inference.schedule_directory_path='./output' \
inference.ckpt_override_path='/app/RFdiffusion/models/InpaintSeq_Fold_ckpt.pt'
The error (similar to the above) is:
RuntimeError: mat1 and mat2 shapes cannot be multiplied (12321x92 and 103x64)
and here's the full traceback:
Error executing job with overrides: ['inference.input_pdb=3IOL.pdb', 'contigmap.contigs=[B10-35/0 70-100]', 'ppi.hotspot_res=[B28,B29]', 'inference.num_designs=1', 'contigmap.inpaint_str=[B10-35]', 'inference.schedule_directory_path=./output', 'inference.ckpt_override_path=/my/local/path/to/software/RFdiffusion/models/InpaintSeq_Fold_ckpt.pt']
Traceback (most recent call last):
File "/my/local/path/to/prd/conda_envs/SE3nv/bin/run_inference.py", line 7, in <module>
exec(compile(f.read(), __file__, 'exec'))
File "/my/local/path/to/software/RFdiffusion/scripts/run_inference.py", line 195, in <module>
main()
File "/my/local/path/to/prd/conda_envs/SE3nv/lib/python3.9/site-packages/hydra/main.py", line 94, in decorated_main
_run_hydra(
File "/my/local/path/to/prd/conda_envs/SE3nv/lib/python3.9/site-packages/hydra/_internal/utils.py", line 394, in _run_hydra
_run_app(
File "/my/local/path/to/prd/conda_envs/SE3nv/lib/python3.9/site-packages/hydra/_internal/utils.py", line 457, in _run_app
run_and_report(
File "/my/local/path/to/prd/conda_envs/SE3nv/lib/python3.9/site-packages/hydra/_internal/utils.py", line 223, in run_and_report
raise ex
File "/my/local/path/to/prd/conda_envs/SE3nv/lib/python3.9/site-packages/hydra/_internal/utils.py", line 220, in run_and_report
return func()
File "/my/local/path/to/prd/conda_envs/SE3nv/lib/python3.9/site-packages/hydra/_internal/utils.py", line 458, in <lambda>
lambda: hydra.run(
File "/my/local/path/to/prd/conda_envs/SE3nv/lib/python3.9/site-packages/hydra/_internal/hydra.py", line 132, in run
_ = ret.return_value
File "/my/local/path/to/prd/conda_envs/SE3nv/lib/python3.9/site-packages/hydra/core/utils.py", line 260, in return_value
raise self._return_value
File "/my/local/path/to/prd/conda_envs/SE3nv/lib/python3.9/site-packages/hydra/core/utils.py", line 186, in run_job
ret.return_value = task_function(task_cfg)
File "/my/local/path/to/software/RFdiffusion/scripts/run_inference.py", line 94, in main
px0, x_t, seq_t, plddt = sampler.sample_step(
File "/my/local/path/to/software/RFdiffusion/rfdiffusion/inference/model_runners.py", line 722, in sample_step
msa_prev, pair_prev, px0, state_prev, alpha, logits, plddt = self.model(msa_masked,
File "/my/local/path/to/prd/conda_envs/SE3nv/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1051, in _call_impl
return forward_call(*input, **kwargs)
File "/my/local/path/to/software/RFdiffusion/rfdiffusion/RoseTTAFoldModel.py", line 99, in forward
pair, state = self.templ_emb(t1d, t2d, alpha_t, xyz_t, pair, state, use_checkpoint=use_checkpoint)
File "/my/local/path/to/prd/conda_envs/SE3nv/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1051, in _call_impl
return forward_call(*input, **kwargs)
File "/my/local/path/to/software/RFdiffusion/rfdiffusion/Embeddings.py", line 253, in forward
templ = self.emb(templ) # Template templures (B, T, L, L, d_templ)
File "/my/local/path/to/prd/conda_envs/SE3nv/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1051, in _call_impl
return forward_call(*input, **kwargs)
File "/my/local/path/to/prd/conda_envs/SE3nv/lib/python3.9/site-packages/torch/nn/modules/linear.py", line 96, in forward
return F.linear(input, self.weight, self.bias)
File "/my/local/path/to/prd/conda_envs/SE3nv/lib/python3.9/site-packages/torch/nn/functional.py", line 1847, in linear
return torch._C._nn.linear(input, weight, bias)
RuntimeError: mat1 and mat2 shapes cannot be multiplied (12321x92 and 103x64)
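For what it's worth, the mismatch can be read directly off the last frame: the template-embedding `nn.Linear` in the checkpoint expects 103 input channels, but the pipeline built a template feature tensor with only 92 channels per residue pair (the 12321 rows being, presumably, the 111 x 111 residue-pair grid). A minimal sketch reproducing just the shape error, with the dimensions taken from the traceback above:

```python
import torch
import torch.nn as nn

# Dimensions taken from the traceback: the template feature tensor
# arrives with 92 channels per residue pair (12321 = 111 * 111 pairs),
# but the Fold checkpoint's embedding weight has in_features=103,
# so F.linear cannot multiply the two matrices.
templ = torch.randn(12321, 92)   # features the pipeline actually produced
emb = nn.Linear(103, 64)         # layer shape stored in the checkpoint

try:
    emb(templ)
    error_message = None
except RuntimeError as exc:
    error_message = str(exc)

print(error_message)
```

So the failing checkpoints disagree with the rest of the pipeline about the template feature width, which suggests (an assumption on my part, not something confirmed in this thread) that the *_Fold checkpoints expect extra conditioning channels, e.g. the scaffold-guided secondary-structure/block-adjacency inputs, that this invocation does not construct.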
Let me know if there's any other information I can provide!
Thank you! We'll take a look!