LLaMA-Adapter
multi-gpu
"AssertionError: Loading a checkpoint for MP=1 but world size is 2" when I set --nproc_per_node to 2.
How can I run this on two 16 GB GPUs? Inference OOMs on a single one.
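For context on the error: the 7B LLaMA checkpoint ships as a single `consolidated.00.pth` shard, so its model-parallel size (MP) is 1, and the loader asserts that MP equals the torchrun world size. A minimal sketch of that check, assuming it mirrors the shard-count assertion in the official LLaMA loader (`check_mp` is a hypothetical name for illustration):

```python
def check_mp(ckpt_files, world_size):
    # The loader counts consolidated.*.pth shards to infer the
    # model-parallel size (MP) the checkpoint was saved with...
    shards = [f for f in ckpt_files if f.endswith(".pth")]
    mp = len(shards)
    # ...and requires one process per shard, so MP must equal world size.
    assert mp == world_size, (
        f"Loading a checkpoint for MP={mp} but world size is {world_size}"
    )

# 7B checkpoint: one shard, so it only loads with --nproc_per_node 1.
check_mp(["consolidated.00.pth"], world_size=1)
```

So with the single-shard 7B checkpoint, `--nproc_per_node 2` will always trip this assertion; fitting it on smaller GPUs would need something other than raising the process count (e.g. lower-precision loading).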
Thanks