How to configure the Hugging Face API? There is no Hugging Face option in the UI.
This configuration doesn't work:
```yaml
openhands:
  image: docker.all-hands.dev/all-hands-ai/openhands:0.13
  container_name: openhands
  restart: always
  extra_hosts:
    - host.docker.internal:host-gateway
  environment:
    - LOG_ALL_EVENTS=true
    - SANDBOX_RUNTIME_CONTAINER_IMAGE=docker.all-hands.dev/all-hands-ai/runtime:0.13-nikolaik
    - LLM_API_KEY=hf_xxx
    - LLM_MODEL=huggingface/Qwen/Qwen2.5-Coder-32B-Instruct
  ports:
    - 127.0.0.1:8102:3000
  volumes:
    - /var/run/docker.sock:/var/run/docker.sock
  pull_policy: always
  tty: true
  stdin_open: true
```
Looking at the LiteLLM docs for Hugging Face (https://docs.litellm.ai/docs/providers/huggingface), the `LLM_MODEL` value looks right.
What is the error you are getting? I'm trying to understand whether it's having issues connecting to the LLM or whether it's something with the additional configs you have given it.
Can confirm that Hugging Face doesn't work as of Nov 15 on the main branch.
Model: https://huggingface.co/deepseek-ai/DeepSeek-Coder-V2-Instruct-0724
Errors in logs repeat like this:
```
Provider List: https://docs.litellm.ai/docs/providers

12:03:42 - openhands:INFO: manager.py:31 - Conversation 5646e7f2-3af6-4c67-8bca-98d9e799598f connected in 2.4713730812072754 seconds
12:03:42 - openhands:INFO: manager.py:31 - Conversation 5646e7f2-3af6-4c67-8bca-98d9e799598f connected in 0.9300143718719482 seconds
INFO:     172.17.0.1:41440 - "GET /api/list-files HTTP/1.1" 200 OK
12:03:43 - openhands:ERROR: retry_mixin.py:47 - litellm.APIError: HuggingfaceException - Original Response received: ; Stacktrace: Traceback (most recent call last):
  File "/app/.venv/lib/python3.12/site-packages/requests/models.py", line 974, in json
    return complexjson.loads(self.text, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/json/__init__.py", line 346, in loads
    return _default_decoder.decode(s)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/json/decoder.py", line 337, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/json/decoder.py", line 355, in raw_decode
    raise JSONDecodeError("Expecting value", s, err.value) from None
json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/app/.venv/lib/python3.12/site-packages/litellm/llms/huggingface_restapi.py", line 693, in completion
    completion_response = response.json()
                          ^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/requests/models.py", line 978, in json
    raise RequestsJSONDecodeError(e.msg, e.doc, e.pos)
requests.exceptions.JSONDecodeError: Expecting value: line 1 column 1 (char 0)
. Attempt #1 | You can customize retry values in the configuration.
Provider List: https://docs.litellm.ai/docs/providers
```
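For what it's worth, `JSONDecodeError: Expecting value: line 1 column 1 (char 0)` is exactly what Python's `json` module raises when asked to parse an empty (or otherwise non-JSON) body, and the log shows `Original Response received: ;`, i.e. an empty response. A minimal sketch reproducing just that parsing step (`body = ""` is a stand-in for whatever the endpoint actually returned):

```python
# Reproduces the parsing failure from the traceback: json.loads on an
# empty body raises "Expecting value: line 1 column 1 (char 0)".
import json

body = ""  # stand-in for the empty response body the endpoint returned

try:
    json.loads(body)
except json.JSONDecodeError as e:
    print(e)  # Expecting value: line 1 column 1 (char 0)
```

So the failure happens before any retry logic matters: the Hugging Face endpoint is not returning JSON for this request, and LiteLLM blows up decoding the body rather than surfacing a clearer upstream error.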
CC @enyst would appreciate a look when you have some time 🙏