Juhan Bae
Results: 3 issues of Juhan Bae
Currently, the maximum query batch size is capped at the largest batch size the model can run. Supporting larger query batch sizes should make IF (influence function) analysis significantly faster. A rough sketch of the batching idea is included below.
enhancement
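Not the library's actual API, just a minimal PyTorch sketch of why a larger query batch helps: pairwise influence scores are dot products between query gradients and training-example gradients, so scoring a bigger block of queries on each sweep over the training data means fewer sweeps overall. All names here (`flat_grad`, the toy model and data) are illustrative.

```python
import torch
import torch.nn as nn

model = nn.Linear(10, 1)
loss_fn = nn.MSELoss()

def flat_grad(x, y):
    """Gradient of the loss for one example, flattened into a single vector."""
    model.zero_grad()
    loss_fn(model(x), y).backward()
    return torch.cat([p.grad.flatten() for p in model.parameters()])

# Query gradients stacked into one matrix: (num_queries, num_params).
queries = [(torch.randn(1, 10), torch.randn(1, 1)) for _ in range(8)]
query_grads = torch.stack([flat_grad(x, y) for x, y in queries])

# A single pass over the training data now scores *all* queries at once;
# with a larger query batch, fewer passes are needed in total.
train_data = [(torch.randn(1, 10), torch.randn(1, 1)) for _ in range(100)]
scores = []
for x, y in train_data:
    train_grad = flat_grad(x, y)
    scores.append(query_grads @ train_grad)   # (num_queries,) per train example
scores = torch.stack(scores, dim=1)           # (num_queries, num_train)
```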
On the latest version, we get the warning message:
```
FutureWarning: `torch.cuda.amp.GradScaler(args...)` is deprecated. Please use `torch.amp.GradScaler('cuda', args...)` instead.
```
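For reference, the one-line migration the warning points to (assuming a recent PyTorch release where `torch.amp.GradScaler` accepts a device string):

```python
import torch

# Old form, now emits the FutureWarning:
# scaler = torch.cuda.amp.GradScaler()

# New form suggested by the warning; behavior is unchanged for CUDA.
scaler = torch.amp.GradScaler("cuda")
```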