[Question]: Implement OpenAI-compatible API via IONOS
Describe your problem
Hey, I hope you can help me.
I want to add OpenAI-compatible models (Llama 3.1 8B and Llama 3.1 70B) via IONOS. The model can be added, but as soon as I try to access it, I get the following error:
ERROR: Error code: 401 -
{
    'httpStatus': 401,
    'messages': [{
        'errorCode': 'paas-auth-1',
        'message': 'Unauthorized, wrong or no api key provided to process this request'
    }]
}
However, the authentication itself is not incorrect: I tested the same key locally with the OpenAI Python library and it works there, just not via RAGFlow.
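For reference, a minimal sketch of the local test with the OpenAI Python library; the base URL and model id below are placeholders for the IONOS OpenAI-compatible endpoint:

```python
# Minimal local check with the OpenAI Python library (openai >= 1.0).
# Base URL and model id are placeholders for the IONOS OpenAI-compatible
# endpoint; the client adds the "Authorization: Bearer <key>" header itself,
# so the key is passed without a "Bearer " prefix.
from openai import OpenAI

client = OpenAI(
    base_url="https://<ionos-openai-compatible-endpoint>/v1",  # placeholder
    api_key="<IONOS_API_KEY>",                                 # placeholder
)

response = client.chat.completions.create(
    model="meta-llama/Llama-3.1-8B-Instruct",  # placeholder model id
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)
```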
Maybe this is a bug report rather than just a question? Thanks for helping!
@KevinHuSh If I remove the "Bearer " prefix from the API key, I can add the model. However, this cannot be the solution, as the Bearer scheme is required in principle.
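One plausible explanation (an assumption on my side, not confirmed from the RAGFlow code) is that the client prepends the Bearer scheme itself, so pasting "Bearer <key>" into the key field ends up as a doubled prefix in the header. A small sketch of the two header forms with requests; URL, key, and model id are placeholders:

```python
# Sketch of the two Authorization header forms against an OpenAI-compatible
# /chat/completions endpoint. URL, key, and model id are placeholders.
import requests

API_KEY = "<IONOS_API_KEY>"
URL = "https://<ionos-openai-compatible-endpoint>/v1/chat/completions"
payload = {
    "model": "meta-llama/Llama-3.1-8B-Instruct",
    "messages": [{"role": "user", "content": "Hello"}],
}

# Standard HTTP Bearer scheme: "Authorization: Bearer <key>".
ok = requests.post(URL, json=payload,
                   headers={"Authorization": f"Bearer {API_KEY}"})

# If the stored key already contains "Bearer " and the client prepends the
# scheme again, the header becomes "Bearer Bearer <key>" -> 401 from the API.
doubled = requests.post(URL, json=payload,
                        headers={"Authorization": f"Bearer Bearer {API_KEY}"})

print(ok.status_code, doubled.status_code)
```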
When I use the API key without the "Bearer " prefix, I get the following error in my chat: ERROR: 'NoneType' object has no attribute 'split'. Tested with version 0.17.2-slim.
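For what it's worth, that traceback just means .split() was called on a value that is None; a hypothetical illustration of the failure mode (not RAGFlow's actual code):

```python
# Hypothetical illustration (not RAGFlow's actual code): something that was
# expected to hold an "Authorization"-style string is None, and .split() is
# called on it without a guard.
auth_value = None  # e.g. nothing found where "Bearer <key>" was expected

try:
    token = auth_value.split(" ")[-1]
except AttributeError as exc:
    print(exc)  # prints: 'NoneType' object has no attribute 'split'
```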
I would suggest changing the issue type to a bug instead of a question. @KevinHuSh Do you need any other information from my side?
Closing this issue in favor of a new bug report.