
TypeError in low_level_api_chat_cpp.py due to Incorrect Type passed

Open zatevakhin opened this issue 1 year ago • 0 comments

In the file examples/low_level_api/low_level_api_chat_cpp.py, lines L316-L317 return the wrong type: a str is returned where a llama_token (i.e., c_int) is expected. This subsequently causes an error at line L358.

Frequency: Sometimes

Traceback (most recent call last):
  File "/workspace/llama-plugins/interactive.py", line 494, in <module>
    m.interact()
  File "/workspace/llama-plugins/interactive.py", line 439, in interact
    for i in self.output():
  File "/workspace/llama-plugins/interactive.py", line 428, in output
    yield llama_cpp.llama_token_to_str(self.ctx, id).decode("utf-8")
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
ctypes.ArgumentError: argument 2: TypeError: wrong type

P.S. Line numbers in the snippet above are off because a code formatting tool was applied.
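For reference, the same class of error can be reproduced with ctypes alone, without llama.cpp: a function prototype that declares a c_int parameter (as llama_token_to_str does for its llama_token argument) rejects a Python str at call time. The identity function below is purely illustrative.

```python
import ctypes

# Declare a callable whose single argument is a C int, mimicking an
# argument declared as llama_token (c_int) in the low-level bindings.
identity = ctypes.CFUNCTYPE(ctypes.c_int, ctypes.c_int)(lambda x: x)

assert identity(42) == 42  # an int converts cleanly to c_int

try:
    identity("42")  # a str cannot be converted to c_int
    raised = None
except ctypes.ArgumentError as err:
    raised = err  # same exception type as in the traceback above
assert isinstance(raised, ctypes.ArgumentError)
```

This is why yielding a str from the generator blows up only later, at the ctypes call boundary, rather than at the yield itself.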

Here are two ways to fix it.

  1. Return the EOS token instead of a string:

if len(self.embd) > 0 and self.embd[-1] == llama_cpp.llama_token_eos():
    if not self.params.instruct:
        yield llama_cpp.llama_token_eos()
    break

  2. Tokenize the string " [end of text]\n" and return the resulting tokens.
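A minimal sketch of the second option. Here tokenize() is a hypothetical stand-in for llama_cpp.llama_tokenize (the real call needs a model context and output buffer); the point is only that the generator yields int token ids, so a downstream llama_token_to_str call always receives a c_int-compatible value.

```python
def tokenize(text):
    # Hypothetical stand-in: real code would call llama_cpp.llama_tokenize
    # against the loaded model and collect the resulting token ids.
    return [ord(c) for c in text]

def output_tokens(text=" [end of text]\n"):
    # Yield token ids (ints), never raw strings, so every yielded value
    # is valid as the llama_token (c_int) argument further down the line.
    for tok in tokenize(text):
        yield tok

toks = list(output_tokens())
assert all(isinstance(t, int) for t in toks)
```

Either fix keeps the generator's contract uniform: every yielded value is a token id, and string conversion happens in exactly one place.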

zatevakhin · Apr 15 '23 18:04