Chinese-LLaMA-Alpaca
Support multi-gpu transformers inference
Add a script for multi-gpu inference
Is it possible to use one script for both single-GPU and multi-GPU inference (for instance, users could launch the script with `--single-gpu` (default) or `--multi-gpu`)? That way we only have to maintain one script.
Thank you for your contributions. We are indeed interested in your multi-GPU implementation. As requested above, please consider merging the original inference script and yours into a unified one, so that users can use it with more flexibility.
Okay, I will finish that work. Thank you for the suggestion.
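The unified CLI discussed above could be sketched roughly as follows. This is only an illustration, not the repository's actual script: the flag names come from the thread, while the function names (`parse_args`, `device_map_for`) and the model path are hypothetical. It uses the standard `argparse` mutually exclusive group so `--single-gpu` stays the default, and maps the choice to the `device_map` argument that `transformers.from_pretrained` accepts (`"auto"` shards across GPUs via `accelerate`; a device index pins to one GPU).

```python
import argparse

def parse_args(argv=None):
    # Hypothetical unified CLI: single-GPU by default, --multi-gpu opts in.
    parser = argparse.ArgumentParser(description="Unified single/multi-GPU inference")
    parser.add_argument("--base_model", default="path/to/model")  # placeholder path
    group = parser.add_mutually_exclusive_group()
    group.add_argument("--single-gpu", dest="multi_gpu", action="store_false",
                       help="run on a single GPU (default)")
    group.add_argument("--multi-gpu", dest="multi_gpu", action="store_true",
                       help="shard the model across all visible GPUs")
    parser.set_defaults(multi_gpu=False)
    return parser.parse_args(argv)

def device_map_for(multi_gpu):
    # "auto" lets accelerate place layers across GPUs; 0 pins everything to cuda:0.
    return "auto" if multi_gpu else 0

# Example: the default run is single-GPU; --multi-gpu switches to sharded loading.
print(device_map_for(parse_args([]).multi_gpu))                # prints 0
print(device_map_for(parse_args(["--multi-gpu"]).multi_gpu))   # prints auto

# The chosen value would then be passed to the model loader, e.g.:
# model = AutoModelForCausalLM.from_pretrained(args.base_model,
#                                              device_map=device_map_for(args.multi_gpu))
```

Keeping the branch in one helper means the rest of the script (tokenization, generation loop) stays identical for both modes, which is what makes a single maintained script feasible.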
Thank you for your advice. I will complete the task as per your instructions.