                        [Bug]: PraisonAI Compatibility Issues with Non-OpenAI Keys
What happened?
I'm experiencing difficulties when trying to use PraisonAI through Fabric (via fabric -a). Despite having a valid API key from OpenRouter, I receive authentication errors from Fabric that make it seem as though it is attempting to communicate with OpenAI directly. Does this mean PraisonAI only works with OpenAI?
After I enter my question, Fabric spends several seconds processing before printing error messages to standard output.
...snip...
  File "C:\Users\XXX\AppData\Local\pipx\pipx\venvs\fabric\Lib\site-packages\openai\_base_client.py", line 1020, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.AuthenticationError: Error code: 401 - {'error': {'message': 'Incorrect API key provided: sk-or-v1*************************************************************7abd. You can find your API key at https://platform.openai.com/account/api-keys.', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_api_key'}}
Version check
- [X] Yes I was.
Relevant log output
PS C:\Users\XXX> fabric -a
Enter Question: Why is the sky blue?
Starting PraisonAI...
Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File "C:\Users\XXX\.local\bin\fabric.exe\__main__.py", line 7, in <module>
  File "C:\Users\XXX\AppData\Local\pipx\pipx\venvs\fabric\Lib\site-packages\installer\client\cli\fabric.py", line 120, in main
    standalone.agents(text)
  File "C:\Users\XXX\AppData\Local\pipx\pipx\venvs\fabric\Lib\site-packages\installer\client\cli\utils.py", line 430, in agents
    praison_ai.main()
  File "C:\Users\XXX\AppData\Local\pipx\pipx\venvs\fabric\Lib\site-packages\praisonai\cli.py", line 69, in main
    self.agent_file = generator.generate()
                      ^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\XXX\AppData\Local\pipx\pipx\venvs\fabric\Lib\site-packages\praisonai\auto.py", line 45, in generate
    response = self.client.chat.completions.create(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\XXX\AppData\Local\pipx\pipx\venvs\fabric\Lib\site-packages\instructor\patch.py", line 570, in new_create_sync
    response = retry_sync(
               ^^^^^^^^^^^
  File "C:\Users\XXX\AppData\Local\pipx\pipx\venvs\fabric\Lib\site-packages\instructor\patch.py", line 387, in retry_sync
    for attempt in max_retries:
  File "C:\Users\XXX\AppData\Local\pipx\pipx\venvs\fabric\Lib\site-packages\tenacity\__init__.py", line 435, in __iter__
    do = self.iter(retry_state=retry_state)
         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\XXX\AppData\Local\pipx\pipx\venvs\fabric\Lib\site-packages\tenacity\__init__.py", line 368, in iter
    result = action(retry_state)
             ^^^^^^^^^^^^^^^^^^^
  File "C:\Users\XXX\AppData\Local\pipx\pipx\venvs\fabric\Lib\site-packages\tenacity\__init__.py", line 410, in exc_check
    raise retry_exc.reraise()
          ^^^^^^^^^^^^^^^^^^^
  File "C:\Users\XXX\AppData\Local\pipx\pipx\venvs\fabric\Lib\site-packages\tenacity\__init__.py", line 183, in reraise
    raise self.last_attempt.result()
          ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\ProgramData\anaconda3\Lib\concurrent\futures\_base.py", line 449, in result
    return self.__get_result()
           ^^^^^^^^^^^^^^^^^^^
  File "C:\ProgramData\anaconda3\Lib\concurrent\futures\_base.py", line 401, in __get_result
    raise self._exception
  File "C:\Users\XXX\AppData\Local\pipx\pipx\venvs\fabric\Lib\site-packages\instructor\patch.py", line 390, in retry_sync
    response = func(*args, **kwargs)
               ^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\XXX\AppData\Local\pipx\pipx\venvs\fabric\Lib\site-packages\openai\_utils\_utils.py", line 277, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\XXX\AppData\Local\pipx\pipx\venvs\fabric\Lib\site-packages\openai\resources\chat\completions.py", line 590, in create
    return self._post(
           ^^^^^^^^^^^
  File "C:\Users\XXX\AppData\Local\pipx\pipx\venvs\fabric\Lib\site-packages\openai\_base_client.py", line 1240, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\XXX\AppData\Local\pipx\pipx\venvs\fabric\Lib\site-packages\openai\_base_client.py", line 921, in request
    return self._request(
           ^^^^^^^^^^^^^^
  File "C:\Users\XXX\AppData\Local\pipx\pipx\venvs\fabric\Lib\site-packages\openai\_base_client.py", line 1020, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.AuthenticationError: Error code: 401 - {'error': {'message': 'Incorrect API key provided: sk-or-v1*************************************************************7abd. You can find your API key at https://platform.openai.com/account/api-keys.', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_api_key'}}
Relevant screenshots (optional)
No response
Same issue. I am using DeepSeek V2. OpenAI is way too expensive compared to these models and I would prefer to use these.
I sort of fixed it by modifying the code in the PraisonAI Python files to point to my own custom AI base URL and model. It would be easier if we could change it at runtime through the CLI.
Curious if you'd be willing to quickly outline the steps & files necessary. 😃
I'm not a coder, but I work in a technical field, so I put on my thinking cap. I went into the PraisonAI installation folder and changed the auto.py file. It mentions OpenAI and a GPT-4 model; I changed the base URL and model name to the DeepSeek values and it worked! The idea is to find the files that hard-code the OpenAI endpoint and model and change them to your preferred API provider.
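For anyone else attempting this, the change is roughly along these lines. This is a sketch only: the exact code in praisonai/auto.py differs between versions, and the DeepSeek endpoint and model name below are just examples for whatever provider you use:

    import os
    from openai import OpenAI

    # Sketch only: auto.py builds an OpenAI client against the default endpoint
    # and a hard-coded GPT-4 model. Point the client at your own provider instead.
    client = OpenAI(
        base_url="https://api.deepseek.com",   # example: DeepSeek's endpoint; use your provider's
        api_key=os.environ["OPENAI_API_KEY"],  # the key for that provider
    )

    # ...and wherever the model name is hard-coded, pass your provider's model, e.g.:
    # response = client.chat.completions.create(model="deepseek-chat", messages=[...])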
Thanks so much for the breakdown! I appreciate it!
Are you using PraisonAI for agents?
Yes! It's all done on the fly when you pipe a prompt in with the echo command, something like 'echo "your question" | fabric --agents'.
I was also able to get it working with LM Studio, though the open-source models are not handling the commands properly. This is my .env file under /home/
@xssdoctor Can you take a look at this please? It looks like a lot of people want to use this, and we should add it to the upcoming Go version as well.
I am having the same issue; it seems to ignore the OPENAI_BASE_URL and OPENAI_API_KEY environment variables and instead tries to hit the OpenAI API directly.
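For reference, this is the behaviour I'd expect, since the stock openai Python client already reads both variables from the environment when it is constructed without arguments. A sketch with example OpenRouter values, not my actual config:

    import os
    from openai import OpenAI

    # Example values only; substitute your provider's endpoint and key.
    os.environ["OPENAI_BASE_URL"] = "https://openrouter.ai/api/v1"
    os.environ["OPENAI_API_KEY"] = "sk-or-v1-..."

    # With no explicit arguments, the client picks up both variables from the
    # environment, so requests should go to the configured base URL rather
    # than api.openai.com.
    client = OpenAI()
    print(client.base_url)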
Try what I did and modify the files directly. See my response above until they fix it.
Here's how I fixed it: