
Issue with emoji decoding

CyberTimon opened this issue 2 years ago · 1 comment

When the model wants to output an emoji, this error comes up:

Debugging middleware caught exception in streamed response at a point where response headers were already sent.
Traceback (most recent call last):
  File "C:\Users\zblac\AppData\Local\Programs\Python\Python310\lib\site-packages\werkzeug\wsgi.py", line 500, in __next__
    return self._next()
  File "C:\Users\zblac\AppData\Local\Programs\Python\Python310\lib\site-packages\werkzeug\wrappers\response.py", line 50, in _iter_encoded
    for item in iterable:
  File "C:\Users\zblac\llama.cpp\test\normal.py", line 37, in vicuna
    for line in response:
  File "C:\Users\zblac\AppData\Local\Programs\Python\Python310\lib\site-packages\llama_cpp\llama.py", line 370, in _create_completion
    "text": text[start:].decode("utf-8"),
UnicodeDecodeError: 'utf-8' codec can't decode byte 0xf0 in position 0: unexpected end of data

CyberTimon avatar Apr 09 '23 16:04 CyberTimon
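The byte 0xf0 in the traceback is the lead byte of a four-byte UTF-8 sequence, which is what most emoji encode to, so the crash happens when a token boundary splits a character and a partial sequence is passed to .decode("utf-8"). Below is a minimal standard-library sketch of one way to decode such a byte stream safely; the function name and the chunk iterable are illustrative, and this is not necessarily the fix that landed in llama-cpp-python:

import codecs

def iter_decoded(chunks):
    """Yield text from an iterable of raw UTF-8 byte chunks, buffering any
    multi-byte character (such as an emoji) split across chunk boundaries."""
    # The incremental decoder holds back a trailing partial sequence instead
    # of raising; "replace" turns genuinely invalid bytes into U+FFFD.
    decoder = codecs.getincrementaldecoder("utf-8")(errors="replace")
    for chunk in chunks:
        text = decoder.decode(chunk)  # returns "" while a sequence is incomplete
        if text:
            yield text
    tail = decoder.decode(b"", final=True)  # flush anything left at end of stream
    if tail:
        yield tail

# The fire emoji is four bytes (f0 9f 94 a5); splitting it across two chunks
# would crash a naive per-chunk .decode("utf-8"), but decodes cleanly here:
print("".join(iter_decoded([b"ok \xf0\x9f", b"\x94\xa5"])))  # -> "ok 🔥"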

I can confirm this issue, but it only appears in streaming mode. In regular mode it works fine.

Niek avatar Apr 12 '23 09:04 Niek
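This observation fits the traceback: in non-streaming mode the complete byte buffer is decoded once at the end, so every multi-byte character is whole, whereas streaming decodes token by token and can cut an emoji in half. A quick standard-library illustration of exactly that split:

emoji = "\U0001F525".encode("utf-8")  # fire emoji: b'\xf0\x9f\x94\xa5'
print(emoji.decode("utf-8"))          # fine: all four bytes decoded together
emoji[:1].decode("utf-8")             # UnicodeDecodeError: 'utf-8' codec can't
                                      # decode byte 0xf0 in position 0:
                                      # unexpected end of data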

Closing this. Reopen if #118 didn't fix it.

gjmulder avatar May 23 '23 09:05 gjmulder