nanoGPT
Running multi-GPU inference
Hi,
I have been using the sample.py file to run inference. It works fine on a single GPU, but how can I run it across multiple GPUs?