gpu_poor
Why doesn't batch size affect memory usage in inference mode?
Changing the batch size doesn't seem to affect the reported GPU memory usage when the calculator is set to INFERENCE MODE. That doesn't seem right. Is this normal?
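For context, a rough back-of-the-envelope sketch of why batch size would be expected to matter even in inference: weights are fixed, but the KV cache scales linearly with batch size. The helper below and its constants are hypothetical (fp16 everywhere, LLaMA-7B-like shapes), not the calculator's actual formula:

```python
def estimate_inference_memory_gb(n_params_b, n_layers, hidden_size,
                                 seq_len, batch_size, bytes_per_el=2):
    """Crude inference-memory estimate: weights + KV cache (assumed fp16)."""
    # Model weights are independent of batch size.
    weights = n_params_b * 1e9 * bytes_per_el
    # KV cache: one K and one V tensor per layer, each [batch, seq_len, hidden].
    kv_cache = 2 * n_layers * batch_size * seq_len * hidden_size * bytes_per_el
    return (weights + kv_cache) / 1e9

# Assumed 7B-parameter model: 32 layers, hidden size 4096, 2048-token context.
m1 = estimate_inference_memory_gb(7, 32, 4096, 2048, batch_size=1)
m8 = estimate_inference_memory_gb(7, 32, 4096, 2048, batch_size=8)
print(f"batch=1: {m1:.1f} GB, batch=8: {m8:.1f} GB")
```

Under these assumptions, batch 1 is roughly 15 GB and batch 8 roughly 23 GB, so the difference is real but the 14 GB of weights dominates at small batch sizes, which could make small batch-size changes look like they have no effect.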