RAFT
Training RAFT-small takes longer than RAFT
I compared training with train_mixed.sh using the normal model and the --small flag.
For some reason, the small model takes longer to train (I also increased the batch size).
Which parameters should I use to train the small model faster?
Here is the command I used:
python -u train.py --name raft-chairs-small --small --stage chairs --validation chairs --gpus 0 --num_steps 120000 --batch_size 16 --lr 0.00025 --image_size 368 496 --wdecay 0.0001 --mixed_precision
Thanks!
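One thing worth checking: since the batch size differs between the two runs, comparing wall-clock time per step (or per run) can be misleading. Normalizing to images per second gives a fairer comparison. A minimal sketch, using hypothetical numbers rather than measured values:

```python
def images_per_second(batch_size: int, seconds_per_step: float) -> float:
    """Throughput normalized by batch size, so runs with
    different batch sizes can be compared fairly."""
    return batch_size / seconds_per_step

# Hypothetical timings, not real measurements from either model:
full_model = images_per_second(batch_size=8, seconds_per_step=0.50)
small_model = images_per_second(batch_size=16, seconds_per_step=1.25)

print(f"full:  {full_model:.1f} img/s")   # 16.0 img/s
print(f"small: {small_model:.1f} img/s")  # 12.8 img/s
```

With numbers like these, the small model looks slower per step but the larger batch partly compensates; only the normalized throughput tells you which run is actually faster per image.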