OpenFactVerification

ollama LOCAL_API_URL not working

papiche opened this issue 1 year ago · 1 comment

I tried to register my ollama node in api_config.yaml:

SERPER_API_KEY: null
OPENAI_API_KEY: null
ANTHROPIC_API_KEY: null
LOCAL_API_KEY: anykey
LOCAL_API_URL: http://127.0.0.1:11434

But I encounter an error:

python webapp.py --api_config api_config.yaml
== Init decompose_model with model: gpt-4o
[INFO]2024-09-11 20:58:57,178 __init__.py:61: == LLMClient is not specified, use default llm client.
Traceback (most recent call last):
  File "/home/frd/workspace/OpenFactVerification/webapp.py", line 84, in <module>
    factcheck_instance = FactCheck(
                         ^^^^^^^^^^
  File "/home/frd/workspace/OpenFactVerification/factcheck/__init__.py", line 63, in __init__
    setattr(self, key, LLMClient(model=_model_name, api_config=self.api_config))
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/frd/workspace/OpenFactVerification/factcheck/utils/llmclient/gpt_client.py", line 15, in __init__
    self.client = OpenAI(api_key=self.api_config["OPENAI_API_KEY"])
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/frd/miniconda3/lib/python3.12/site-packages/openai/_client.py", line 105, in __init__
    raise OpenAIError(
openai.OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable

What should I do? Thanks.

papiche · Sep 11 '24
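
For context, the traceback above shows why LOCAL_API_URL is ignored: the decompose model defaults to gpt-4o, which maps to the GPT client in factcheck/utils/llmclient/gpt_client.py, and that client constructs OpenAI(api_key=self.api_config["OPENAI_API_KEY"]) without ever reading LOCAL_API_URL. Since ollama exposes an OpenAI-compatible API under /v1, one conceivable workaround is to patch the client construction to fall back to the local endpoint when LOCAL_API_URL is set. The snippet below is only a minimal sketch of that idea, not upstream code; the build_client helper, the /v1 suffix, and the dummy key are assumptions about a standard ollama setup, and you would still need to point the model names at something ollama actually serves rather than gpt-4o.

    # Hypothetical patch sketch (not the project's supported configuration):
    # route the OpenAI client to a local ollama server when LOCAL_API_URL is set.
    from openai import OpenAI

    def build_client(api_config: dict) -> OpenAI:
        local_url = api_config.get("LOCAL_API_URL")
        if local_url:
            # ollama serves an OpenAI-compatible API under /v1;
            # the api_key is required by the client but ignored by ollama.
            return OpenAI(
                base_url=local_url.rstrip("/") + "/v1",
                api_key=api_config.get("LOCAL_API_KEY") or "ollama",
            )
        return OpenAI(api_key=api_config["OPENAI_API_KEY"])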

I am also encountering the same problem. It seems like ollama is currently not supported.

PardonMySkillz · Oct 17 '24
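
Independently of OpenFactVerification, it may be worth confirming that the ollama endpoint itself speaks the OpenAI-compatible protocol before wiring it into api_config.yaml. The check below is a minimal sketch assuming ollama is running on 127.0.0.1:11434 and that a model such as llama3 has already been pulled; the model name is an assumption.

    # Sanity check: talk to the local ollama server through its
    # OpenAI-compatible /v1 endpoint (the api_key is required but ignored).
    from openai import OpenAI

    client = OpenAI(base_url="http://127.0.0.1:11434/v1", api_key="anykey")
    resp = client.chat.completions.create(
        model="llama3",  # assumption: pulled beforehand with `ollama pull llama3`
        messages=[{"role": "user", "content": "ping"}],
    )
    print(resp.choices[0].message.content)

If this works but the webapp still fails, the problem is in how the library builds its clients rather than in the ollama server itself.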