How to run RFpeptides on Google Colab?
Hello. I'm a Japanese student trying to run RFpeptides on Google Colab using Sergey Ovchinnikov's notebook. However, because of my limited knowledge of Python, I don't understand which cells I should change, or how, to add the inference.cyclic and inference.cyc_chains flags for designing macrocyclic peptides. I'm sorry to ask such a basic question, but I would appreciate any help. Apologies for my English.
Hello,
Unfortunately, Sergey's Colab notebook is not yet set up to run the new RFpeptides capabilities, which is why you are having trouble finding where to change the inference.cyclic and inference.cyc_chains configuration options.
For now, my recommendation would be to run RFdiffusion either via the Docker image or by installing RFdiffusion locally (or on a computing cluster you have access to), and then use the example scripts (examples/design_macrocyclic_monomer.sh and examples/design_macrocyclic_binder.sh) to get started.
If you use the Docker image, here is what the command looks like for the design_macrocyclic_monomer.sh example:
docker run -it --rm \
-v ./outputs:/outputs \
rosettacommons/rfdiffusion \
inference.model_directory_path=/app/RFdiffusion/models \
inference.output_prefix=/outputs/rfpeptide_test \
inference.input_pdb=/app/RFdiffusion/examples/input_pdbs/7zkr_GABARAP.pdb \
inference.num_designs=2 \
'contigmap.contigs=[12-18]' \
inference.cyclic=True \
diffuser.T=50 \
inference.cyc_chains='a'
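One small practical note (this is general Docker behavior, not something from the RFdiffusion docs): if ./outputs does not exist yet, Docker will create it owned by root, which can make the results awkward to inspect or clean up. Pre-creating it on the host avoids that:

```shell
# create the host-side output directory before the run,
# so Docker bind-mounts your directory instead of creating a root-owned one
mkdir -p outputs
```

Then run the docker command above from the same directory.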
If you are using Apptainer/Singularity, it will look like this:
apptainer pull rfd.sif docker://rosettacommons/rfdiffusion
apptainer run --nv -B ./outputs:/outputs rfd.sif \
inference.model_directory_path=/app/RFdiffusion/models \
inference.schedule_directory_path=schedules \
inference.input_pdb=/app/RFdiffusion/examples/input_pdbs/7zkr_GABARAP.pdb \
inference.output_prefix=/outputs/rfpeptide_test \
inference.num_designs=2 \
'contigmap.contigs=[12-18]' \
inference.cyclic=True \
diffuser.T=50 \
inference.cyc_chains='a'
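After either run finishes, the designs should land in ./outputs under the prefix you chose (the file names below assume the inference.output_prefix used above):

```shell
# each design produces a backbone PDB plus a .trb metadata file,
# e.g. rfpeptide_test_0.pdb and rfpeptide_test_0.trb
ls -l outputs/
```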
Thank you so much for your speedy response! I'll try it!