WizardLM
Multi-GPU for 30B
How can we run the 30B model across multiple GPUs? Is there a straightforward pure-Python implementation?
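Not an official answer, but one common approach is Hugging Face `transformers` with `accelerate`, which can shard a model's layers across all visible GPUs via `device_map="auto"`. The sketch below assumes that stack is installed and uses a placeholder checkpoint name and layer count; the `load_sharded` helper and `make_device_map` (a manual fallback that assigns contiguous layer blocks to each GPU, for LLaMA-style module names) are illustrative, not part of any WizardLM release.

```python
def make_device_map(num_layers: int, num_gpus: int) -> dict:
    """Manual fallback: map contiguous blocks of decoder layers to each GPU.

    Module names follow the LLaMA-style layout used by many 30B checkpoints
    (embed_tokens / layers.N / norm / lm_head); adjust for your architecture.
    """
    per_gpu = -(-num_layers // num_gpus)  # ceiling division
    device_map = {
        "model.embed_tokens": 0,             # embeddings on the first GPU
        "model.norm": num_gpus - 1,          # final norm with the head
        "lm_head": num_gpus - 1,
    }
    for i in range(num_layers):
        device_map[f"model.layers.{i}"] = min(i // per_gpu, num_gpus - 1)
    return device_map


def load_sharded(model_name: str):
    """Load a large causal LM sharded across all visible GPUs.

    Requires `pip install transformers accelerate` and enough combined GPU
    memory for the fp16 weights (~60 GB for a 30B model). Imports are local
    so merely defining this sketch does not pull in heavy dependencies.
    """
    import torch
    from transformers import AutoModelForCausalLM

    model = AutoModelForCausalLM.from_pretrained(
        model_name,
        torch_dtype=torch.float16,
        device_map="auto",  # accelerate splits layers across available GPUs
    )
    return model
```

Usage would look like `model = load_sharded("WizardLM/WizardLM-30B-V1.0")` (checkpoint name assumed); generation then works as on a single GPU, with activations moved between devices automatically. If `"auto"` gives a poor split, pass a dict from `make_device_map` as `device_map` instead.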