
Support multi-gpu transformers inference

Open sunyuhan19981208 opened this issue 1 year ago • 3 comments

Add a script for multi-gpu inference

sunyuhan19981208 avatar May 08 '23 11:05 sunyuhan19981208

Is it possible to use one script for both single-GPU and multi-GPU inference? For instance, users could launch the script with --single-gpu (the default) or --multi-gpu, so that we only have to maintain one script.

airaria avatar May 10 '23 04:05 airaria
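A minimal sketch of the unified script suggested above, assuming the `--multi-gpu` flag from this thread and a hypothetical `--base_model` argument; the multi-GPU path relies on transformers/accelerate's `device_map="auto"` to shard layers across visible GPUs:

```python
import argparse


def resolve_load_kwargs(multi_gpu: bool) -> dict:
    """Return from_pretrained kwargs for the chosen inference mode.

    multi_gpu=True shards the model across all visible GPUs via
    accelerate's device_map="auto"; otherwise the whole model is
    placed on GPU 0.
    """
    if multi_gpu:
        return {"device_map": "auto"}   # shard layers across GPUs
    return {"device_map": {"": 0}}      # whole model on cuda:0


def build_parser() -> argparse.ArgumentParser:
    parser = argparse.ArgumentParser()
    parser.add_argument("--base_model", required=True)  # hypothetical arg name
    parser.add_argument("--multi-gpu", dest="multi_gpu", action="store_true")
    return parser


if __name__ == "__main__":
    args = build_parser().parse_args()
    kwargs = resolve_load_kwargs(args.multi_gpu)
    # Sketch only -- actual loading needs transformers and accelerate:
    # from transformers import AutoModelForCausalLM
    # model = AutoModelForCausalLM.from_pretrained(
    #     args.base_model, torch_dtype="auto", **kwargs)
    print(kwargs)
```

With this shape, single-GPU behavior stays the default and `--multi-gpu` only changes how the weights are placed, so the generation loop itself can be shared by both modes.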

Thank you for your contributions. We are indeed interested in your multi-GPU implementation. As requested above, please consider merging the original inference script and yours into a unified one, so that users have more flexibility.

Okay, I will finish that work. Thank you for the suggestion.

sunyuhan19981208 avatar May 10 '23 07:05 sunyuhan19981208

Is it possible to use one script for both single-GPU and multi-GPU inference? For instance, users could launch the script with --single-gpu (the default) or --multi-gpu, so that we only have to maintain one script.

Thank you for your advice. I will complete the task as described.

sunyuhan19981208 avatar May 10 '23 07:05 sunyuhan19981208