
Training RAFT-small takes longer than RAFT

Open enric1994 opened this issue 4 years ago • 0 comments

I compared training with train_mixed.sh using the normal model and the --small flag. For some reason, the small model takes longer per run (even though I also increased the batch size).

Which parameters should I use to train the small model faster? Here is the command I used:

```shell
python -u train.py --name raft-chairs-small --small --stage chairs --validation chairs --gpus 0 --num_steps 120000 --batch_size 16 --lr 0.00025 --image_size 368 496 --wdecay 0.0001 --mixed_precision
```
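Before tuning flags, it may help to measure per-iteration time for both variants directly, since total wall-clock time also depends on data loading and step count. A minimal, generic timing helper (hypothetical; `train_step`, `small_model`, and `full_model` are placeholder names, not part of the RAFT repo) could look like:

```python
import time

def benchmark(step_fn, iters=50, warmup=5):
    """Average the wall-clock time of an arbitrary training-step callable.

    Runs `warmup` untimed iterations first (lets allocator caches,
    autotuners, etc. settle), then averages `iters` timed iterations.
    """
    for _ in range(warmup):
        step_fn()
    start = time.perf_counter()
    for _ in range(iters):
        step_fn()
    return (time.perf_counter() - start) / iters

# Usage sketch (names are placeholders): wrap one optimizer step of each
# model variant in a closure and compare the averages.
# avg_small = benchmark(lambda: train_step(small_model, batch))
# avg_full  = benchmark(lambda: train_step(full_model, batch))
```

Comparing the two averages isolates the per-step cost from differences in `--num_steps` or dataloading, which makes it easier to see whether the small model itself is actually slower.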

Thanks!

enric1994 — Jan 31 '21