openai
Other models besides OpenAI?
A number of open-source models, like LLaMA 2, can run in local environments behind a web server (such as LM Studio, Koboldcpp, etc.) that exposes the same endpoints as OpenAI. Can we have a flag/option that allows specifying a different OpenAI-compatible endpoint?
@cspenn you can do it in the development version:
# Install the development version from the "r6" branch
remotes::install_github("irudnyts/openai", ref = "r6")

library(openai)

# Point the client at any OpenAI-compatible server
client <- OpenAI(
  base_url = "your_base_url"
)

completion <- client$chat$completions$create(
  model = "gpt-3.5-turbo",
  messages = list(list("role" = "user", "content" = "What's up?"))
)
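For a local server, the same pattern should work by swapping in the server's address. As a rough sketch: LM Studio typically exposes its OpenAI-compatible API at http://localhost:1234/v1, and the model name must match whatever model the local server has loaded, so both values below are placeholders you'd adjust for your setup.

library(openai)

# Hypothetical local setup: base_url points at LM Studio's
# OpenAI-compatible server (adjust host/port to your server).
client <- OpenAI(
  base_url = "http://localhost:1234/v1"
)

# "llama-2-7b-chat" is a placeholder; use the identifier of the
# model actually loaded in your local server.
completion <- client$chat$completions$create(
  model = "llama-2-7b-chat",
  messages = list(list("role" = "user", "content" = "What's up?"))
)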
I'm currently working on all the other endpoints, and hopefully I'll be able to roll out a stable version soon.