ChatGLM2-6B
[BUG/Help] cli_demo.py seems to have an unused import of readline
Is there an existing issue for this?
- [X] I have searched the existing issues
Current Behavior
On line 5, `import readline` seems unused and causes a ModuleNotFoundError on Windows.
Expected Behavior
No response
Steps To Reproduce
Just run cli_demo.py.
Environment
- OS:win 10
- Python:3.11
- Transformers:4.27.1
- PyTorch:2.0.1+cu118
- CUDA Support (`python -c "import torch; print(torch.cuda.is_available())"`) : True
Anything else?
It looks like an oversight, or am I missing what readline is used for?
I deleted the import and the script ran fine.
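If the readline behavior (line editing and history for `input()`) is still wanted on Unix-like systems, a guarded import would avoid the error on Windows. This is just a minimal sketch of that idea, not the repository's actual code:

```python
# Sketch: import readline only where it exists. On Windows the module is
# not part of the standard library, so the bare import raises
# ModuleNotFoundError; input() still works there, just without history.
try:
    import readline  # enables line editing/history for input() on Unix
except ImportError:
    readline = None  # Windows or minimal builds: fall back to plain input()
```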