
Create simple_local_chat.py

Open Mrgithub93 opened this issue 2 years ago • 1 comment

An easy-to-use, high-level chat script.

Mrgithub93 avatar Aug 05 '23 19:08 Mrgithub93
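
For anyone looking for a starting point, here is a minimal sketch of such a script. It assumes llama-cpp-python's high-level `Llama` class and its `create_chat_completion()` method; the model path is a placeholder for whatever GGUF model you have locally.

```python
# simple_local_chat.py -- minimal local chat loop (sketch, not the official example)
from llama_cpp import Llama

# Placeholder path: point this at a local GGUF model file.
llm = Llama(model_path="./models/your-model.gguf", n_ctx=2048)

messages = [{"role": "system", "content": "You are a helpful assistant."}]

while True:
    user_input = input("You: ")
    if user_input.strip().lower() in {"exit", "quit"}:
        break
    messages.append({"role": "user", "content": user_input})

    # Generate a reply conditioned on the full conversation so far.
    response = llm.create_chat_completion(messages=messages)
    reply = response["choices"][0]["message"]["content"]
    print("Assistant:", reply)

    messages.append({"role": "assistant", "content": reply})
```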

Hi, thanks for this example!

When I keep talking with it, I eventually get this error. Is there a way to avoid it?

ValueError: Requested tokens (527) exceed context window of 512

delock avatar Oct 03 '23 17:10 delock
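
That error means the accumulated conversation has grown past the 512-token context window the model was loaded with. A minimal sketch of two common workarounds, assuming the same `Llama` class as above: load the model with a larger `n_ctx`, and trim old turns from the history before each call. The `MAX_TURNS` cap below is a hypothetical value, not something from the library.

```python
from llama_cpp import Llama

# Load with a larger context window than the 512-token default.
llm = Llama(model_path="./models/your-model.gguf", n_ctx=2048)

MAX_TURNS = 8  # hypothetical cap on retained user/assistant exchanges

def trim_history(messages):
    """Keep the system prompt plus only the most recent turns."""
    system, rest = messages[:1], messages[1:]
    return system + rest[-MAX_TURNS * 2:]

# In the chat loop, trim before generating so the prompt stays inside n_ctx:
#     messages = trim_history(messages)
#     response = llm.create_chat_completion(messages=messages)
```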