
[Bug]: response_json["eval_count"] doesn't exist - llms/ollama.py

Open mongolu opened this issue 7 months ago • 2 comments

What happened?

A bug happened!

After a while, this error pops up:

Relevant log output

'created_at': '2024-01-10T08:52:17.111694849Z',
 'done': True,
 'eval_duration': 516371613757000,
 'load_duration': 260310,
 'model': 'MixtralOrochi8x7B:latest',
 'response': '',
 'total_duration': 306412003}
Traceback (most recent call last):
  File "/opt/miniconda3/lib/python3.11/site-packages/litellm/llms/ollama.py", line 325, in ollama_acompletion
    completion_tokens = response_json["eval_count"]
                        ~~~~~~~~~~~~~^^^^^^^^^^^^^^
KeyError: 'eval_count'
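
For reference, the failure mode is a plain KeyError on a missing dictionary key. A minimal, self-contained reproduction (using a made-up payload shaped like the truncated log above) looks like this:

# Minimal reproduction: indexing a key that the Ollama response
# did not include raises KeyError, matching the traceback above.
response_json = {
    "created_at": "2024-01-10T08:52:17.111694849Z",
    "done": True,
    "response": "",
    "total_duration": 306412003,
}

try:
    completion_tokens = response_json["eval_count"]
except KeyError as err:
    print(f"KeyError: {err}")  # prints: KeyError: 'eval_count'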


mongolu avatar Jan 10 '24 11:01 mongolu

While we are waiting for an official fix, you can do this:

Edit the file lib/python3.11/site-packages/litellm/llms/ollama.py. At line 322, change this:

completion_tokens = response_json["eval_count"]

to this:

completion_tokens = response_json.get("eval_count", 10)

This makes the count 10 if the eval_count parameter isn't returned by the model server (in my case ollama).
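
For anyone applying this by hand, here is a short sketch of the patched behaviour (the payload below is assumed, mirroring the log output in the report, and 10 is just the placeholder default from the suggestion above):

# Sketch of the workaround: dict.get() returns the fallback value
# instead of raising KeyError when "eval_count" is absent.
response_json = {"done": True, "response": "", "total_duration": 306412003}

# before: completion_tokens = response_json["eval_count"]   # raises KeyError
completion_tokens = response_json.get("eval_count", 10)      # returns the fallback

print(completion_tokens)  # prints: 10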

Hopefully the devs will fix this properly soon.

Speedway1 avatar Jan 14 '24 02:01 Speedway1

I also get this in 1.18.8

Shadoweee77 avatar Jan 21 '24 21:01 Shadoweee77

The fix is being reviewed here: https://github.com/BerriAI/litellm/pull/1514

puffo avatar Jan 24 '24 03:01 puffo