Abdul Waheed Aagha
sh configs/r50_motr_train.sh

/home/user/anaconda3/envs/motr2/lib/python3.8/site-packages/torch/distributed/launch.py:180: FutureWarning: The module torch.distributed.launch is deprecated and will be removed in future. Use torchrun. Note that --use_env is set by default in torchrun. If your script expects...
The process gets stuck at torch.distributed.barrier(). Here is my environment information:

PyTorch version: 1.13.0
Is debug build: False
CUDA used to build PyTorch: 11.7
ROCM used to build PyTorch: N/A
OS: ...
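As a sanity check, I can run a minimal standalone script outside MOTR to see whether torch.distributed itself reaches the barrier on this machine. This is only a sketch, assuming the NCCL backend and the RANK/WORLD_SIZE/LOCAL_RANK variables that torchrun sets; barrier_check.py is a hypothetical file name, not part of the MOTR repo. If this also hangs, the problem is likely in the environment (NCCL, driver, or networking) rather than in the training code.

# barrier_check.py - minimal torch.distributed sanity check (sketch, not from the MOTR repo)
# Launch with: torchrun --nproc_per_node=2 barrier_check.py
import os
import datetime

import torch
import torch.distributed as dist


def main():
    # torchrun exports LOCAL_RANK, RANK, WORLD_SIZE, MASTER_ADDR, MASTER_PORT
    local_rank = int(os.environ["LOCAL_RANK"])
    torch.cuda.set_device(local_rank)

    # Short collective timeout so a stuck rank surfaces as an error instead of
    # blocking indefinitely.
    dist.init_process_group(backend="nccl",
                            timeout=datetime.timedelta(seconds=60))

    print(f"rank {dist.get_rank()}/{dist.get_world_size()} reached the barrier")
    dist.barrier()  # every rank must arrive here, otherwise all ranks wait
    print(f"rank {dist.get_rank()} passed the barrier")

    dist.destroy_process_group()


if __name__ == "__main__":
    main()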
Also, in the following command I cannot see where the "generator_path" param is used:

python pipeline.py \
    --prompt "Rose Valentines' Day" \
    --mode "background" \
    --encoder_path /path/to/encoder \
    --decoder_path /path/to/decoder...
Thanks for the response. Let me try it.
# Evaluation Script
EXP=swin
DATASET=webui
COMMAND=category_generate
python main.py --encode_backbone swin --encode_embd 1024 \
    --dataset $DATASET --exp $EXP --evaluate \
    --decoder_path ../logs/$DATASET/$EXP/checkpoints/decoder.pth \
    --encoder_path ../logs/$DATASET/$EXP/checkpoints/encoder.pth \
    --eval_command $COMMAND \
    --calculate_harmony \
    ...
# Training Script
python main.py --dataset webui --exp layout \
    --data_dir ../data \
    --epoch 100 --lr 1.5e-5 --lr_decay \
    --encode_backbone swin --encode_embd 1024 \
    --finetune_vb --pretrain_vb

# Evaluation Command
EXP=layout...