[Bug]: response_json["eval_count"] doesn't exist - llms/ollama.py
What happened?
A bug happened!
After a while, this error pops up:
Relevant log output
'created_at': '2024-01-10T08:52:17.111694849Z',
'done': True,
'eval_duration': 516371613757000,
'load_duration': 260310,
'model': 'MixtralOrochi8x7B:latest',
'response': '',
'total_duration': 306412003}
Traceback (most recent call last):
File "/opt/miniconda3/lib/python3.11/site-packages/litellm/llms/ollama.py", line 325, in ollama_acompletion
completion_tokens = response_json["eval_count"]
~~~~~~~~~~~~~^^^^^^^^^^^^^^
KeyError: 'eval_count'
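The failure is easy to reproduce in isolation: the final chunk Ollama returns (as in the log above) can omit eval_count, so a direct dictionary lookup raises KeyError. A minimal sketch using the fields from the logged response, not litellm's actual code path:

# Minimal reproduction with the fields from the logged final chunk.
# The dict deliberately has no 'eval_count' key, mirroring the log above.
response_json = {
    "created_at": "2024-01-10T08:52:17.111694849Z",
    "done": True,
    "eval_duration": 516371613757000,
    "load_duration": 260310,
    "model": "MixtralOrochi8x7B:latest",
    "response": "",
    "total_duration": 306412003,
}

completion_tokens = response_json["eval_count"]  # raises KeyError: 'eval_count'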
While we are waiting for an official fix, you can do this:
Edit the file lib/python3.11/site-packages/litellm/llms/ollama.py. At line 322, change this:
completion_tokens = response_json["eval_count"]
to this:
completion_tokens = response_json.get("eval_count", 10)
This makes the count 10 if the eval_count parameter isn't returned by the model server (in my case, Ollama).
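For reference, here is the same idea as a self-contained sketch; safe_completion_tokens is a hypothetical helper, not litellm's actual code, and the fallback of 10 simply mirrors the manual patch above:

# Hypothetical helper illustrating the defensive lookup; not litellm's real
# implementation. A missing count falls back to 10, matching the patch above.
def safe_completion_tokens(response_json: dict, fallback: int = 10) -> int:
    # .get() avoids the KeyError when Ollama's final chunk omits 'eval_count'
    return response_json.get("eval_count", fallback)

print(safe_completion_tokens({"done": True, "response": ""}))       # -> 10
print(safe_completion_tokens({"done": True, "eval_count": 42}))     # -> 42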
Hopefully the devs will fix this properly soon.
I also get this error in 1.18.8.
The fix is being reviewed here: https://github.com/BerriAI/litellm/pull/1514