OpenAI proxy base URL CORS errors
Describe the bug
When attempting to fetch data from an OpenAI proxy URL (https://xxxxx.xxx/chat/completions) within Obsidian, the request is blocked by a CORS policy error. The console shows an error indicating that the response to the preflight request doesn't pass the access control check because no `Access-Control-Allow-Origin` header is present on the requested resource.
To Reproduce
- Configure Obsidian to use an OpenAI proxy URL.
- Attempt to fetch data from the OpenAI API through the proxy.
- Observe the CORS policy error in the console (see the sketch below).
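For illustration, the failing call is roughly the kind of plain fetch shown below. This is only a sketch of what the plugin presumably sends (the proxy URL, API key, and model name are placeholders), not the plugin's actual code; the JSON content type and Authorization header make it a non-simple request, so a preflight OPTIONS is sent first and blocked.

```ts
// Sketch only: a plain fetch from the Obsidian renderer (origin app://obsidian.md).
// The JSON content type and Authorization header make this a non-simple request,
// so a preflight OPTIONS request is sent first; if the proxy doesn't answer it
// with Access-Control-Allow-Origin, the actual POST is never sent.
const proxyBaseUrl = "https://xxxxx.xxx"; // placeholder, as in the report above

async function chatViaProxy(apiKey: string): Promise<unknown> {
  const response = await fetch(`${proxyBaseUrl}/chat/completions`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify({
      model: "gpt-3.5-turbo", // placeholder model name
      messages: [{ role: "user", content: "Hello" }],
    }),
  });
  return response.json(); // never reached while the preflight is blocked
}
```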
Expected behavior
The proxy server should handle CORS by including the necessary `Access-Control-Allow-Origin` header in its responses, allowing requests from `app://obsidian.md`. This would let the application communicate with the OpenAI API without running into CORS policy errors.
Proposed Fix
If an OpenAI proxy base URL is provided, I think a local proxy server should be started, similar to what is already done for the Claude model, but pointed at the OpenAI proxy base URL.
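I haven't looked at how the Claude proxy is implemented in this plugin, so the following is only a rough sketch of the idea: a small local server that answers the preflight itself with the headers described under "Expected behavior" and forwards everything else to the configured base URL. The port and all constant names are made up for this example.

```ts
// Rough sketch only: a tiny local proxy that answers the CORS preflight itself
// and forwards everything else to the user-configured OpenAI proxy base URL.
// The port and names are invented; the plugin's real Claude proxy may differ.
import * as http from "http";

const OPENAI_PROXY_BASE_URL = "https://xxxxx.xxx"; // placeholder from the report
const LOCAL_PORT = 9000; // arbitrary

const corsHeaders = {
  "Access-Control-Allow-Origin": "app://obsidian.md",
  "Access-Control-Allow-Methods": "GET, POST, OPTIONS",
  "Access-Control-Allow-Headers": "Content-Type, Authorization",
};

http
  .createServer(async (req, res) => {
    // Answer the preflight locally so the renderer never hits the remote CORS policy.
    if (req.method === "OPTIONS") {
      res.writeHead(204, corsHeaders);
      res.end();
      return;
    }

    // Buffer the request body and forward it to the configured base URL.
    const chunks: Buffer[] = [];
    for await (const chunk of req) chunks.push(chunk as Buffer);

    const upstream = await fetch(`${OPENAI_PROXY_BASE_URL}${req.url ?? ""}`, {
      method: req.method,
      headers: {
        "Content-Type": req.headers["content-type"] ?? "application/json",
        Authorization: req.headers.authorization ?? "",
      },
      body: chunks.length > 0 ? Buffer.concat(chunks) : undefined,
    });

    // Relay the upstream response, adding the CORS headers Obsidian needs.
    res.writeHead(upstream.status, {
      ...corsHeaders,
      "Content-Type": upstream.headers.get("content-type") ?? "application/json",
    });
    res.end(Buffer.from(await upstream.arrayBuffer()));
  })
  .listen(LOCAL_PORT);
```

A real implementation would also need to relay streaming (SSE) responses, which this sketch skips.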
Additional context
To improve flexibility and the user experience, I would suggest supporting multiple proxy configurations that can be selected from the models drop-down. That way, the stock OpenAI models would still be usable alongside the proxied ones, rather than the proxy address simply overriding every request. This would cover a wider range of use cases and preferences and make for a smoother workflow within Obsidian.
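Purely as an illustration of what I mean, a settings shape along these lines could back such a drop-down (none of these type or field names exist in the plugin; they are invented for this example):

```ts
// Purely illustrative: one possible settings shape for per-endpoint proxy profiles,
// so the models drop-down could mix the stock OpenAI models with proxied ones.
// None of these type or field names exist in the plugin.
interface ProxyProfile {
  name: string;      // label shown in the models drop-down
  baseUrl: string;   // e.g. "https://api.openai.com/v1" or a proxy address
  apiKey: string;
  models: string[];  // models served by this endpoint
}

interface ProxySettings {
  profiles: ProxyProfile[];
  activeProfile: string; // name of the currently selected profile
}

const exampleSettings: ProxySettings = {
  profiles: [
    { name: "OpenAI", baseUrl: "https://api.openai.com/v1", apiKey: "sk-...", models: ["gpt-4"] },
    { name: "Company proxy", baseUrl: "https://xxxxx.xxx", apiKey: "", models: ["gpt-3.5-turbo"] },
  ],
  activeProfile: "OpenAI",
};
```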
First of all, thanks for your amazing work!
I actually ran into the same issue, but from a different use case:
We are running our own Hugging Face Text Generation Inference API with models, some of which expose the same API as OpenAI, so we can actually use the openai package against our inference endpoint. But when trying to use it via the proxy base URL setting, I ran into the same CORS error. It's a pity; this would be amazing to have, not only for the chat model but also for embeddings, where our models can likewise be used in the same way as OpenAI's.
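For context, this is roughly how we already use the openai package against our own endpoint outside of Obsidian; the base URL and model names below are placeholders for our internal deployment:

```ts
// Sketch of how we call our TGI deployment through the openai package today,
// outside Obsidian. The base URL and model names are placeholders for our setup.
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "https://our-tgi-host.example/v1", // placeholder for our OpenAI-compatible endpoint
  apiKey: "internal-token-or-dummy",
});

async function demo(): Promise<void> {
  // Chat completions work because the endpoint exposes the OpenAI-compatible route.
  const chat = await client.chat.completions.create({
    model: "our-chat-model", // placeholder
    messages: [{ role: "user", content: "Hello from our own inference server" }],
  });
  console.log(chat.choices[0].message.content);

  // Embeddings work the same way when the deployed model supports them.
  const embedding = await client.embeddings.create({
    model: "our-embedding-model", // placeholder
    input: "Some note text to embed",
  });
  console.log(embedding.data[0].embedding.length);
}

demo();
```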
Hope this is fixed soon :)
I had this issue with LocalAI - the solution was to start LocalAI with `--cors`