
Issue using the embedding function with an OpenAI-compatible embedding model

Open · elohffa opened this issue 1 year ago · 3 comments

Describe the bug
I get an error when running `python -m plugin_mgt --refresh`. I wanted to enable the embedding feature, but instead I got a JSON decoder error, as shown below.

```
python -m plugin_mgt --show --refresh
Traceback (most recent call last):
  File "", line 198, in _run_module_as_main
  File "", line 88, in _run_code
  File "/home/elohlinux/TaskWeaver/scripts/plugin_mgt.py", line 66, in
    plugin_manager.refresh()
  File "/home/elohlinux/TaskWeaver/scripts/plugin_mgt.py", line 45, in refresh
    self.plugin_selector.refresh()
  File "/home/elohlinux/TaskWeaver/scripts/../taskweaver/code_interpreter/code_generator/plugin_selection.py", line 100, in refresh
    plugin_embeddings = self.llm_api.get_embedding_list([text for idx, text in plugins_to_embedded])
  File "/home/elohlinux/TaskWeaver/scripts/../taskweaver/llm/__init__.py", line 268, in get_embedding_list
    return self.embedding_service.get_embeddings(strings)
  File "/home/elohlinux/TaskWeaver/scripts/../taskweaver/llm/openai.py", line 234, in get_embeddings
    embedding_results = self.client.embeddings.create(
  File "/home/elohlinux/TaskWeaver/venv/lib/python3.11/site-packages/openai/resources/embeddings.py", line 108, in create
    return self._post(
  File "/home/elohlinux/TaskWeaver/venv/lib/python3.11/site-packages/openai/_base_client.py", line 1179, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
  File "/home/elohlinux/TaskWeaver/venv/lib/python3.11/site-packages/openai/_base_client.py", line 868, in request
    return self._request(
  File "/home/elohlinux/TaskWeaver/venv/lib/python3.11/site-packages/openai/_base_client.py", line 961, in _request
    return self._process_response(
  File "/home/elohlinux/TaskWeaver/venv/lib/python3.11/site-packages/openai/_base_client.py", line 1055, in _process_response
    return api_response.parse()
  File "/home/elohlinux/TaskWeaver/venv/lib/python3.11/site-packages/openai/_response.py", line 242, in parse
    parsed = self._parse()
  File "/home/elohlinux/TaskWeaver/venv/lib/python3.11/site-packages/openai/_response.py", line 221, in _parse
    data = response.json()
  File "/home/elohlinux/TaskWeaver/venv/lib/python3.11/site-packages/httpx/_models.py", line 756, in json
    return jsonlib.loads(self.text, **kwargs)
  File "/home/elohlinux/anaconda3/lib/python3.11/json/__init__.py", line 346, in loads
    return _default_decoder.decode(s)
  File "/home/elohlinux/anaconda3/lib/python3.11/json/decoder.py", line 337, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "/home/elohlinux/anaconda3/lib/python3.11/json/decoder.py", line 355, in raw_decode
    raise JSONDecodeError("Expecting value", s, err.value) from None
json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)
```

To Reproduce

  1. I update the taskweaver_config.json with the following config:

     ```json
     {
       "llm.api_base": "http://192.168.0.101:8080/v1",
       "llm.api_key": "xxxx",
       "llm.model": "gpt-4-1106-preview",
       "llm.embedding_api_type": "openai",
       "llm.embedding_model": "text-embedding-ada-002",
       "llm.response_format": "json_object"
     }
     ```

  2. I run `run_pytest.sh`; all tests pass.
  3. I run `python -m plugin_mgt --refresh`; it reports the error above.
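One way to isolate the problem is to send the same request the OpenAI client sends and look at the raw response body before any JSON parsing. This is a hypothetical probe (the URL and key are the placeholders from the config above):

```python
import json
import urllib.request

def build_embeddings_request(api_base: str, api_key: str) -> urllib.request.Request:
    # Mirror the request body the OpenAI client posts to /v1/embeddings.
    payload = json.dumps({
        "model": "text-embedding-ada-002",
        "input": ["hello world"],
    }).encode()
    return urllib.request.Request(
        f"{api_base.rstrip('/')}/embeddings",
        data=payload,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )

req = build_embeddings_request("http://192.168.0.101:8080/v1", "xxxx")
# urllib.request.urlopen(req).read() would then print the raw body,
# revealing whether the server returns JSON, HTML, or nothing.
```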

Environment Information (please complete the following information):

  • OS: Ubuntu 22
  • Python Version: 3.11.3
  • LLM that you're using: GPT-4
  • Other Configurations except the LLM api/key related: the LLM is using an OpenAI-compatible embedding model

elohffa · Jan 21 '24 16:01

Hi, @elohffa! Why are you calling the OpenAI API via a local API base? There may be a misalignment between the LLM API and the embedding API.

zhangxu0307 · Jan 22 '24 10:01

The reason is that I don't know how to integrate my embedding model directly, so I wrapped it behind an OpenAI-API-compatible endpoint; that's why I call the OpenAI API via a local API base. Is there any option to add my local server as an extra endpoint with a custom embedding model?
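For a wrapper like this to work with the OpenAI client, the local server's `/v1/embeddings` route has to return a body in the OpenAI embeddings schema. A minimal sketch of a response builder, where `embed` is a stand-in for the local model (all names here are illustrative, not part of TaskWeaver):

```python
from typing import Callable, List

def openai_embeddings_response(
    inputs: List[str],
    embed: Callable[[str], List[float]],
    model: str = "text-embedding-ada-002",
) -> dict:
    # One entry per input string, in order, matching the OpenAI schema.
    data = [
        {"object": "embedding", "index": i, "embedding": embed(text)}
        for i, text in enumerate(inputs)
    ]
    return {
        "object": "list",
        "data": data,
        "model": model,
        # Dummy token counts; a real wrapper would count tokens.
        "usage": {"prompt_tokens": 0, "total_tokens": 0},
    }
```

If the wrapper returns anything other than a JSON body in this shape (or an OpenAI-style JSON error), the client fails exactly as in the traceback above.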

elohffa · Jan 24 '24 04:01

Currently, we do not support customized embedding models beyond the pre-existing models that are listed here.
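For readers hitting the same wall: the traceback shows that TaskWeaver's `get_embedding_list` delegates to `self.embedding_service.get_embeddings(strings)`, so any embedding backend ultimately has to satisfy that one method. The class below is only an illustrative sketch of that interface with a hash-based placeholder, not a supported TaskWeaver extension point:

```python
from typing import List

class LocalEmbeddingService:
    """Illustrative stand-in exposing get_embeddings(strings)."""

    def __init__(self, dim: int = 8):
        self.dim = dim

    def get_embeddings(self, strings: List[str]) -> List[List[float]]:
        # Placeholder: deterministic pseudo-vectors derived from hash();
        # a real implementation would call the local embedding model here.
        return [
            [((hash(s) >> i) % 997) / 997.0 for i in range(self.dim)]
            for s in strings
        ]
```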

zhangxu0307 · Jan 24 '24 06:01