
Support OpenAI API format

anagri opened this issue 1 year ago • 9 comments

Support the OpenAI API format by giving the option to switch between Ollama's proprietary API format and the OpenAI API format.

To fetch the list of models - https://platform.openai.com/docs/api-reference/models/list?lang=node.js

To generate a chat completion - https://platform.openai.com/docs/api-reference/chat?lang=node.js

This would allow servers that follow the OpenAI API format to be used with Hollama, e.g. Bodhi App - https://github.com/BodhiSearch/BodhiApp
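For reference, a minimal sketch of the two requests in TypeScript. This only builds the request shapes without sending them; the base URL, API key, and model name are placeholder assumptions, not tied to any particular server:

```typescript
// Sketch only: builds the two OpenAI-format requests without sending them.
// BASE_URL and API_KEY are placeholder assumptions; any OpenAI-compatible
// server should accept the same request shapes.
const BASE_URL = "http://localhost:11434/v1";
const API_KEY = "sk-placeholder"; // many local servers ignore the key

type Message = { role: "system" | "user" | "assistant"; content: string };

// GET /v1/models — list available models
function buildListModelsRequest() {
  return {
    url: `${BASE_URL}/models`,
    init: { headers: { Authorization: `Bearer ${API_KEY}` } },
  };
}

// POST /v1/chat/completions — generate a chat completion
function buildChatRequest(model: string, messages: Message[]) {
  return {
    url: `${BASE_URL}/chat/completions`,
    init: {
      method: "POST",
      headers: {
        Authorization: `Bearer ${API_KEY}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({ model, messages }),
    },
  };
}
```

Sending either request is then just `fetch(url, init)`.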

anagri avatar Jul 18 '24 09:07 anagri

I'm not entirely sure I understand this request, or what the use case would be.

It sounds like you want Hollama to expose an API using the OpenAI schema so other services can interact with Ollama using Hollama as a proxy of sorts? Is that correct?

fmaclen avatar Jul 18 '24 13:07 fmaclen

My understanding is that he instead wants to connect to OpenAI-API-compatible servers that do not use the Ollama API.

binarynoise avatar Jul 19 '24 14:07 binarynoise

@binarynoise What are "OpenAI-API-compatible servers"?

Is the suggested feature a way to set your own OpenAI API key and be able to choose whether to get a completion from Ollama or OpenAI's API?

fmaclen avatar Jul 19 '24 20:07 fmaclen

What are "OpenAI-API-compatible servers"?

Bodhi App seems to be one:

It also exposes these LLM inference capabilities as OpenAI API compatible REST APIs.

@anagri would have to clarify the rest.

binarynoise avatar Jul 20 '24 15:07 binarynoise

Thanks @binarynoise

@fmaclen - many of the apps that run LLMs locally prefer exposing APIs in the OpenAI API format. These include jan.ai, LM Studio, Bodhi App, etc.

So if Hollama starts supporting the OpenAI API, we can use the Hollama UI to connect to the APIs exposed by these apps. And if someone wants to use their OpenAI API key, they can use that as well.

tl;dr - the OpenAI API is more of an industry standard than the proprietary Ollama API. Hollama can reach a wider audience and see broader adoption if it starts supporting the OpenAI API.
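To make the point concrete, here is a sketch of how one UI could target several such servers just by swapping base URLs. The local ports shown are common defaults for each app and are assumptions, not guaranteed values:

```typescript
// Hypothetical presets: same OpenAI wire format, different base URLs.
// The ports are typical defaults for each app and may differ per install.
const OPENAI_COMPATIBLE_SERVERS: Record<string, string> = {
  openai: "https://api.openai.com/v1",
  ollama: "http://localhost:11434/v1", // Ollama's OpenAI-compat endpoints
  "lm-studio": "http://localhost:1234/v1",
  jan: "http://localhost:1337/v1",
};

// The same client code works against any of them.
function modelsEndpoint(server: string): string {
  const base = OPENAI_COMPATIBLE_SERVERS[server];
  if (!base) throw new Error(`Unknown server: ${server}`);
  return `${base}/models`;
}
```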

anagri avatar Jul 20 '24 15:07 anagri

@binarynoise @anagri thanks for the clarification, I understand what you mean now.

fmaclen avatar Jul 20 '24 19:07 fmaclen

To expand a bit, I'm not opposed to the idea of allowing multiple completion servers. This is, however, not a trivial implementation, and we have other priorities at the moment.

If someone wants to give this a shot, I'd start by using ollama.ts as the base template to handle the logic for the other servers.

There are also a few factors to consider for this implementation:

  • Should the user be allowed to switch servers during a New session?
  • Is it reasonable to expect these "OpenAI-compatible servers" to return a list of available models or does it need to be set manually in the Settings view?
  • Do we need to sign the requests with an API key? Or does that vary from server to server?
  • If we do need to sign the requests, is the auth scheme standardized too?
  • Do these servers all have the same API endpoints?
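On the auth questions specifically: OpenAI-compatible servers generally expect a `Bearer` token, while Anthropic (for contrast) uses an `x-api-key` header plus a version header, so signing is close to standard but not universal. A minimal sketch:

```typescript
// Sketch: OpenAI-format servers share one auth scheme (Bearer token),
// but it is not universal across providers — Anthropic, for example,
// uses "x-api-key" plus a required API version header.
type Provider = "openai-compatible" | "anthropic";

function authHeaders(provider: Provider, apiKey: string): Record<string, string> {
  switch (provider) {
    case "openai-compatible":
      return { Authorization: `Bearer ${apiKey}` };
    case "anthropic":
      return { "x-api-key": apiKey, "anthropic-version": "2023-06-01" };
  }
}
```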

fmaclen avatar Jul 20 '24 23:07 fmaclen

References

  • OpenAI's API: https://platform.openai.com/docs/api-reference/authentication
  • Open Web UI: https://github.com/open-webui/pipelines/blob/main/examples/pipelines/providers/openai_manifold_pipeline.py
  • Anthropic's API: https://docs.anthropic.com/en/api/getting-started
  • Ollama with OpenAI interface: https://github.com/ollama/ollama/blob/main/docs/openai.md

fmaclen avatar Sep 16 '24 13:09 fmaclen

  • [ ] Investigate if OpenAI's JS library can receive stop completion requests, or if we need to use fetch
  • [ ] Add OpenAI as a new completion strategy, side by side with our current Ollama implementation
  • [ ] Add a unique identifier for Ollama or OpenAI models
  • [ ] Check the current connected/disconnected logic and find ways to simplify implementation
  • [ ] Settings UI components should behave similarly to the Ollama Settings
  • [ ] Help text should link to: https://help.openai.com/en/articles/4936850-where-do-i-find-my-openai-api-key
  • [ ] In the models menu highlight OpenAI's models with an OpenAI <Badge>
  • Handle errors when:
    • [ ] Network connection to OpenAI/Ollama is broken
    • [ ] Model no longer available
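For the "unique identifier" item above, one minimal approach (the naming scheme here is illustrative, not a decided design) is to qualify each model id with its server:

```typescript
// Sketch: prefix each model id with its server so "llama3" from Ollama
// never collides with a model of the same name elsewhere. Splitting on
// the FIRST colon keeps Ollama tags like "llama3:8b" intact.
type ServerKind = "ollama" | "openai";

function qualifiedModelId(server: ServerKind, model: string): string {
  return `${server}:${model}`;
}

function parseModelId(id: string): { server: ServerKind; model: string } {
  const sep = id.indexOf(":");
  if (sep === -1) throw new Error(`Unqualified model id: ${id}`);
  return { server: id.slice(0, sep) as ServerKind, model: id.slice(sep + 1) };
}
```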

fmaclen avatar Sep 17 '24 14:09 fmaclen