ChatGLM-6B
[BUG/Help] How to deploy the CPU version in Docker?
Is there an existing issue for this?
- [X] I have searched the existing issues
Current Behavior
The Docker image currently provided is the GPU version. How can I deploy the CPU version in Docker?
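What I want to run inside the container is essentially the CPU loading path from the README; a minimal sketch (the prompt string below is just a placeholder):

```python
# CPU-only loading, adapted from the ChatGLM-6B README (no .half().cuda()).
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True)
# .float() keeps the weights in FP32 on the CPU instead of moving them to CUDA.
model = AutoModel.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True).float()
model = model.eval()

response, history = model.chat(tokenizer, "Hello", history=[])
print(response)
```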
Expected Behavior
No response
Steps To Reproduce
It always fails with the error: do not deploy the CPU system under GPU.
Environment
- OS:
- Python:
- Transformers:
- PyTorch:
- CUDA Support (`python -c "import torch; print(torch.cuda.is_available())"`) :
?
Anything else?
No response