Akshay Dudhane
Yes, you understood correctly. Since each burst is already a 4-dimensional tensor, we have no alternative but to keep the batch size at 1. One thing you can try is to combine burst...
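A minimal sketch of why the batch size is pinned to 1: a single burst already occupies four tensor dimensions (frames, channels, height, width), so the data loader's batch axis makes the input 5D. The shapes below (8 frames, 4-channel packed RAW, 48×48 crops) are illustrative assumptions, not the exact Burstormer configuration.

```python
import numpy as np

# Hypothetical burst: (burst_size, channels, height, width)
# e.g. 8 RAW frames, 4-channel packed Bayer, 48x48 crops
burst = np.zeros((8, 4, 48, 48), dtype=np.float32)

# A batch size of 1 adds a leading batch axis, giving a 5D tensor:
batched = burst[None, ...]
print(batched.shape)  # (1, 8, 4, 48, 48)
```

Any larger batch size would simply stack more bursts along that leading axis, which is why scaling is instead done by giving each GPU its own burst.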
For synthetic burst SR, we train the proposed Burstormer for 300 epochs on 4 RTX6000 GPUs, which takes roughly 5-6 days. To fine-tune this model for real burst...
Burstormer is faster than BIPNet. In our settings, we keep precision at 16 and set deterministic to True. In the released code, each GPU serves a single burst, so 4 GPUs together yield an effective batch...
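The settings described above can be sketched as a PyTorch Lightning Trainer configuration. This is an illustrative fragment under assumed API names (the `gpus`/`precision`/`deterministic` arguments of the pre-2.0 `Trainer`), not the released training script, and `BurstormerModule`/`train_loader` are hypothetical placeholders.

```python
import pytorch_lightning as pl

trainer = pl.Trainer(
    gpus=4,              # each GPU serves one burst -> effective batch size 4
    precision=16,        # mixed-precision training, as in our settings
    deterministic=True,  # reproducible runs
    max_epochs=300,      # synthetic burst SR training schedule
)

# trainer.fit(BurstormerModule(), train_loader)  # placeholders, not released code
```

With a per-GPU batch size of 1, the distributed data-parallel setup is what provides the effective batch size of 4.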