llama-cpp-python
Create simple_local_chat.py
An easy-to-use, high-level chat script.
Hi, thanks for this example!
As the conversation gets longer, I eventually hit this error. Is there a way to avoid it?
ValueError: Requested tokens (527) exceed context window of 512
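This error means the prompt (the accumulated chat history) plus the requested completion tokens no longer fit in the model's context window, which defaults to 512 tokens in llama-cpp-python. Two common remedies: pass a larger `n_ctx` when constructing the model (e.g. `Llama(model_path=..., n_ctx=2048)`), and/or trim the oldest turns from the history before each call. The sketch below shows the trimming idea only; it uses a crude word-count stand-in for the tokenizer (a real script would call `llama.tokenize()`), and the function names are illustrative, not part of the library.

```python
# Sketch: keep chat history inside a fixed token budget so the prompt
# never outgrows the context window. `estimate_tokens` is a crude
# stand-in for the model tokenizer -- swap in llama.tokenize() for real use.

def estimate_tokens(text: str) -> int:
    # Rough approximation: one token per whitespace-separated word.
    return len(text.split())

def trim_history(messages, max_tokens: int):
    """Drop the oldest non-system messages until the total estimated
    token count fits within max_tokens. The first (system) message is
    always kept so the assistant's instructions survive trimming."""
    system, rest = messages[0], list(messages[1:])

    def total(msgs):
        return sum(estimate_tokens(m["content"]) for m in msgs)

    while rest and total([system] + rest) > max_tokens:
        rest.pop(0)  # discard the oldest user/assistant turn
    return [system] + rest

history = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "an old, very long question " * 50},
    {"role": "user", "content": "the latest question"},
]
trimmed = trim_history(history, max_tokens=50)
# The long old turn is dropped; the system prompt and latest turn remain.
```

With trimming in place, each call to the chat completion API sees a bounded prompt, so the 512-token default (or whatever `n_ctx` you choose) is never exceeded.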