OnlyOffice AI support
Is your feature request related to a problem? Please describe.
When using the Ollama / OpenAI provider option, the server logs a client error:

WRN Client error ip=1 latency="121.604µs" method=OPTIONS status=401 url=/v1/models
https://www.onlyoffice.com/ai-assistants.aspx
Possible causes:
- Incorrect HTTP method: the /v1/models endpoint may not support the method being used (OPTIONS); many AI API endpoints only allow GET or POST.
- Misconfigured AI provider integration: the method sent by ONLYOFFICE (or another client) does not match the LocalAI API specification for that route.
curl -X OPTIONS http://IP/v1/models -H "Authorization: Bearer key"
→ 405 Not Allowed
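The two observations fit together if the server checks auth before the method: an unauthenticated OPTIONS preflight gets 401 (as in the log), while an authenticated OPTIONS gets 405. A minimal sketch of that routing behaviour — `handle` and its argument names are illustrative, not LocalAI internals:

```javascript
// Illustrative routing sketch: auth is checked before the HTTP method,
// and the models route only accepts GET.
function handle(method, url, authHeader) {
  if (url !== "/v1/models") return 404;        // unknown route
  if (authHeader !== "Bearer key") return 401; // auth rejected first
  if (method !== "GET") return 405;            // only GET is allowed here
  return 200;
}

console.log(handle("GET", "/v1/models", "Bearer key"));     // 200
console.log(handle("OPTIONS", "/v1/models", "Bearer key")); // 405, as curl shows
console.log(handle("OPTIONS", "/v1/models", undefined));    // 401, as in the log
```

This matches both the WRN log line (OPTIONS without credentials → 401) and the authenticated curl test (→ 405).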
Describe the solution you'd like
A custom provider JS file for the AI plugin:
class Provider extends AI.Provider {
  constructor() {
    super(
      "LocalAI",                  // provider name shown in the UI
      "http://localhost:8080/v1", // change to your LocalAI endpoint
      "yourLocalAIAPIKey",        // your key, if applicable
      "v1"                        // model/API version as required
    );
  }
}
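Assuming the provider speaks the OpenAI-compatible API, the concrete routes a client would derive from the base endpoint above look like this — the `endpoints` helper is illustrative, not part of the ONLYOFFICE plugin API:

```javascript
// Derive the OpenAI-compatible routes from the base URL passed to super().
function endpoints(base) {
  const root = base.replace(/\/+$/, ""); // trim trailing slashes
  return {
    models: `${root}/models`,         // GET  — list available models
    chat: `${root}/chat/completions`, // POST — chat requests
  };
}

const urls = endpoints("http://localhost:8080/v1");
console.log(urls.models); // http://localhost:8080/v1/models
console.log(urls.chat);   // http://localhost:8080/v1/chat/completions
```

Note the models route is /v1/models (plural), matching the URL in the log line above.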
This would let ONLYOFFICE connect to LocalAI.
Same issue here: trying to use my local llama.cpp inference server, but ONLYOFFICE doesn't know what to do with it.
@Splarkszter Ollama works with this; where LocalAI doesn't work, I have Ollama to fill the gap.