
Exception: OpenAI API request was not permitted: <!DOCTYPE html> while running local model Qwen

Haxeebraja opened this issue 1 year ago · 3 comments

I am trying to run TaskWeaver with a locally hosted Qwen-72B-Chat model. Both the model and TaskWeaver run inside the same Docker container without internet access. The model is hosted with https://github.com/QwenLM/Qwen/blob/main/openai_api.py
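As a sanity check (the port and model name match my config below; the snippet is only illustrative), the endpoint can be probed directly to confirm it answers with JSON:

```python
# Illustrative probe of the local OpenAI-compatible endpoint started by
# openai_api.py; port and model name are the ones from my config below.
import requests

resp = requests.post(
    "http://localhost:8787/v1/chat/completions",
    json={
        "model": "Qwen-72B-Chat",
        "messages": [{"role": "user", "content": "ping"}],
    },
    timeout=60,
)
print(resp.status_code)
print(resp.headers.get("content-type"))  # expect application/json, not text/html
print(resp.text[:500])
```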

The same configuration works fine with AutoGen but fails with TaskWeaver.

taskweaver_config.json:

```json
{
  "llm.api_base": "http://localhost:8787/v1",
  "llm.api_key": "Null",
  "llm.model": "Qwen-72B-Chat"
}
```

I also tried adding:

```json
"llm.response_format": "json_object",
"llm.api_type": "openai"
```
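For comparison, a direct call through the same openai client that TaskWeaver uses, with the values from this config, would look roughly like this (my own sketch, not taken from TaskWeaver's code):

```python
# Sketch of a direct call with the openai (>=1.0) client using the same
# base URL, key, and model as in taskweaver_config.json above.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8787/v1", api_key="Null")
completion = client.chat.completions.create(
    model="Qwen-72B-Chat",
    messages=[{"role": "user", "content": "ping"}],
)
print(completion.choices[0].message.content)
```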

```
Exception in thread Thread-3 (base_stream_puller):
Traceback (most recent call last):
  File "/workspace/mounted/TaskWeaver/taskweaver/llm/openai.py", line 163, in chat_completion
    res: Any = self.client.chat.completions.create(
  File "/usr/local/lib/python3.10/dist-packages/openai/_utils/_utils.py", line 272, in wrapper
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/openai/resources/chat/completions.py", line 645, in create
    return self._post(
  File "/usr/local/lib/python3.10/dist-packages/openai/_base_client.py", line 1088, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
  File "/usr/local/lib/python3.10/dist-packages/openai/_base_client.py", line 853, in request
    return self._request(
  File "/usr/local/lib/python3.10/dist-packages/openai/_base_client.py", line 930, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.PermissionDeniedError:
```

During handling of the above exception, another exception occurred:

```
Traceback (most recent call last):
  File "/usr/lib/python3.10/threading.py", line 1016, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.10/threading.py", line 953, in run
    self._target(*self._args, **self._kwargs)
  File "/workspace/mounted/TaskWeaver/taskweaver/llm/__init__.py", line 162, in base_stream_puller
    for msg in stream:
  File "/workspace/mounted/TaskWeaver/taskweaver/llm/openai.py", line 224, in chat_completion
    raise Exception(f"OpenAI API request was not permitted: {e}")
Exception: OpenAI API request was not permitted:
```
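The detail that gets swallowed here is the response body attached to the PermissionDeniedError (an HTTP 403). A rough way to surface it with the openai v1 client, reusing the same settings as above:

```python
# Sketch: print the body behind the PermissionDeniedError (HTTP 403).
# In openai>=1.0 it is an APIStatusError, so it carries the underlying
# httpx response; base URL, key, and model mirror the config above.
import openai

client = openai.OpenAI(base_url="http://localhost:8787/v1", api_key="Null")
try:
    client.chat.completions.create(
        model="Qwen-72B-Chat",
        messages=[{"role": "user", "content": "ping"}],
    )
except openai.PermissionDeniedError as e:
    print(e.status_code)                           # 403
    print(e.response.headers.get("content-type"))
    print(e.response.text[:500])                   # reveals an HTML body, if any
```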

Haxeebraja · Jan 24 '24 07:01

Could you try setting `"llm.api_type": "openai"` and `"llm.response_format": null`? Do not wrap the null in quotes.
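With the rest of your settings unchanged, taskweaver_config.json would then look roughly like this (api_base, api_key, and model copied from your config above):

```json
{
  "llm.api_base": "http://localhost:8787/v1",
  "llm.api_key": "Null",
  "llm.model": "Qwen-72B-Chat",
  "llm.api_type": "openai",
  "llm.response_format": null
}
```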

liqul · Jan 24 '24 08:01

> Could you try setting `"llm.api_type": "openai"` and `"llm.response_format": null`? Do not wrap the null in quotes.

I get the same error with this config as well.

Haxeebraja · Jan 24 '24 08:01

After the OpenAI permission-denied error I get the following output, which was missing from the traceback above. I am on a closed network and cannot access the internet unless a URL has been whitelisted by the security team. Is it possible that TaskWeaver (via the openai client) is trying to access this URL: http://www.w3.org/1999/xhtml?

```html
<!DOCTYPE html>
<html xmlns="http://www.w3.org/1999/xhtml">
<body></body>
</html>
```
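To see whether this reply really comes from the Qwen server or from something else on the network, my rough check looks like this (it assumes openai_api.py exposes the standard /v1/models route):

```python
# Rough check: what is actually answering at the api_base, and are any
# proxy settings in play? The /v1/models route is assumed to exist.
import os
import requests

print({k: v for k, v in os.environ.items() if "proxy" in k.lower()})

resp = requests.get("http://localhost:8787/v1/models", timeout=30)
print(resp.status_code, resp.headers.get("content-type"))
print(resp.text[:300])
```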

Haxeebraja · Jan 24 '24 10:01

I fixed this issue by serving the model with FastChat instead.
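For anyone hitting the same thing, the setup is roughly the standard FastChat OpenAI-compatible server recipe (exact flags may differ between FastChat versions; the model path and port here are only illustrative):

```bash
# Roughly the standard FastChat recipe for an OpenAI-compatible endpoint;
# model path and port are illustrative, check the FastChat docs for your version.
python3 -m fastchat.serve.controller
python3 -m fastchat.serve.model_worker --model-path Qwen/Qwen-72B-Chat
python3 -m fastchat.serve.openai_api_server --host 0.0.0.0 --port 8787
```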

Haxeebraja · Jan 25 '24 12:01