[BUG] Wrong request to chat completion endpoint on OpenAI API compatible server
Bug Description
Steps to Reproduce
- Go to "Add custom provider"
- Add the data for your custom provider. I'm using an OpenAI-API-compatible LLM server running on a machine on my local network.
- Ask the LLM a question in a thread.
Expected Results
I expect to get a response from the LLM.
Actual Results
An error appears with:
A network error has occurred. Please check your current network status and the connection with <IP:PORT of the LLM server in my local network>.
Network Error: Failed to fetch
Checking the logs of my LLM server, I noticed that the request is being made with the HTTP method OPTIONS instead of POST.
Screenshots
Screenshot of the app on my phone:
Screenshot of my server's logs:
(It's a Flask application running on Gunicorn)
Device information:
- Operating System: Android
- Application Version: v1.4.0
Additional Context
I noticed that the desktop apps (at least the Linux and Mac ones) are already at v1.4.1, and they don't have this problem (they make the requests with POST).
Also, I can't find a way to build the mobile apps in this repo. How can I do that?
The same problem occurs in the iOS app.
It seems that the mobile app can only use an API server that supports CORS (Cross-Origin Resource Sharing), while the PC clients have no such requirement. I tested this on my own server: it works fine with the PC clients, but the Android app always says "Network Error: Failed to fetch". I added the following HTTP headers, and the problem was solved:
- Access-Control-Allow-Origin: *
- Access-Control-Allow-Methods: POST,OPTIONS
- Access-Control-Allow-Headers: Authorization,Connection,Content-Type,Content-Length
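For reference, the headers above can be wired into any HTTP server. Here is a minimal sketch using only Python's standard library (the handler class, endpoint path, and placeholder response are illustrative, not from the actual server):

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# The three headers listed above, applied to every response.
CORS_HEADERS = {
    "Access-Control-Allow-Origin": "*",
    "Access-Control-Allow-Methods": "POST,OPTIONS",
    "Access-Control-Allow-Headers": "Authorization,Connection,Content-Type,Content-Length",
}


class CORSHandler(BaseHTTPRequestHandler):
    def _send_cors_headers(self):
        for name, value in CORS_HEADERS.items():
            self.send_header(name, value)

    def do_OPTIONS(self):
        # Answer the CORS preflight with 204 and the CORS headers; no body needed.
        self.send_response(204)
        self._send_cors_headers()
        self.end_headers()

    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        _body = self.rfile.read(length)  # the actual chat completion request
        response = b'{"ok": true}'       # placeholder; a real server would call the model here
        self.send_response(200)
        self._send_cors_headers()
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(response)))
        self.end_headers()
        self.wfile.write(response)

    def log_message(self, *args):
        pass  # silence per-request logging for this sketch


# To run: HTTPServer(("0.0.0.0", 8000), CORSHandler).serve_forever()
```

The key point is that the OPTIONS response carries the `Access-Control-*` headers and a 2xx status; the body can be empty.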
@fluxlinkage is right. It seems that Android always sends a CORS preflight (OPTIONS) request before the actual HTTP request.
I solved it by changing my custom server to:
- Support CORS
- Accept OPTIONS requests on all of my endpoints (these requests don't need to return a body)
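Since the server in question is a Flask app running on Gunicorn, the fix might look roughly like this. This is a sketch under assumptions: the route path and response payload are illustrative, not the actual server's code.

```python
from flask import Flask, jsonify, request

app = Flask(__name__)


@app.after_request
def add_cors_headers(response):
    # Attach the CORS headers to every response, including OPTIONS preflights.
    response.headers["Access-Control-Allow-Origin"] = "*"
    response.headers["Access-Control-Allow-Methods"] = "POST,OPTIONS"
    response.headers["Access-Control-Allow-Headers"] = (
        "Authorization,Connection,Content-Type,Content-Length"
    )
    return response


# Illustrative endpoint; accepting OPTIONS alongside POST lets the
# preflight succeed with a 2xx instead of a 405.
@app.route("/v1/chat/completions", methods=["POST", "OPTIONS"])
def chat_completions():
    if request.method == "OPTIONS":
        return "", 204  # the preflight response needs no body
    payload = request.get_json(force=True)
    # ... call the model here; placeholder response:
    return jsonify({"object": "chat.completion", "model": payload.get("model")})
```

With this in place, the Android app's preflight gets the `Access-Control-*` headers it expects, and the follow-up POST goes through normally.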
I eventually forgot about this issue. I'm going to close it.