llama-cpp-python
TypeError in low_level_api_chat_cpp.py due to Incorrect Type passed
In the file examples/low_level_api/low_level_api_chat_cpp.py, lines L316-L317 return the wrong type: a `str` is returned where a `llama_token` (an alias of `c_int`) is expected. This subsequently causes an error at line L358.
Frequency: Sometimes
```
Traceback (most recent call last):
  File "/workspace/llama-plugins/interactive.py", line 494, in <module>
    m.interact()
  File "/workspace/llama-plugins/interactive.py", line 439, in interact
    for i in self.output():
  File "/workspace/llama-plugins/interactive.py", line 428, in output
    yield llama_cpp.llama_token_to_str(self.ctx, id).decode("utf-8")
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
ctypes.ArgumentError: argument 2: TypeError: wrong type
```
P.S. The line numbers in the traceback above do not match the repository because a code formatting tool was applied to my local copy.
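For context, the `ArgumentError` is raised by ctypes' argument checking before the foreign call is even made, not by llama.cpp itself. A minimal standalone reproduction (using libc's `abs` as a stand-in for any foreign function declared with a `c_int` parameter, which is what `llama_token` aliases):

```python
import ctypes

# On POSIX, CDLL(None) resolves symbols from the running process,
# which includes libc's abs().
libc = ctypes.CDLL(None)
libc.abs.argtypes = [ctypes.c_int]  # same situation as a llama_token parameter
libc.abs.restype = ctypes.c_int

print(libc.abs(-5))  # a plain int is accepted: prints 5

try:
    libc.abs("some text")  # a str is rejected by ctypes before the call
except ctypes.ArgumentError as e:
    print(type(e).__name__)  # ArgumentError
```

This is exactly the failure mode in the issue: yielding a `str` where the binding expects an integer token id trips the same check.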
Here are two ways to fix it:
- Return EOS:

```python
if len(self.embd) > 0 and self.embd[-1] == llama_cpp.llama_token_eos():
    if not self.params.instruct:
        yield llama_cpp.llama_token_eos()
    break
```
- Tokenize the string `" [end of text]\n"` and return the resulting tokens.
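A sketch of the second option. `fake_tokenize` is a hypothetical stand-in so the example runs without llama_cpp; the real call would be `llama_cpp.llama_tokenize`, whose exact signature varies across llama-cpp-python versions:

```python
import ctypes

# HYPOTHETICAL stand-in for llama_cpp.llama_tokenize, used only to keep this
# sketch self-contained: it maps each byte to a fake token id. The real
# function fills a token buffer from the model's vocabulary instead.
def fake_tokenize(text: bytes, buf, n_max: int) -> int:
    ids = list(text[:n_max])
    for i, tok in enumerate(ids):
        buf[i] = tok
    return len(ids)

def end_of_text_tokens():
    # Fix 2: tokenize " [end of text]\n" and yield integer token ids
    # (llama_token is an alias of c_int), never a str.
    buf = (ctypes.c_int * 32)()
    n = fake_tokenize(b" [end of text]\n", buf, len(buf))
    for i in range(n):
        yield buf[i]  # indexing a c_int array yields a plain Python int

tokens = list(end_of_text_tokens())
print(all(isinstance(t, int) for t in tokens))  # True
```

Either way, the generator ends up yielding values that ctypes will accept for a `c_int` parameter, which is what line L358 requires.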