Support for LocalAI
LocalAI is an OpenAI drop-in API replacement that supports multiple model families for running LLMs locally on consumer-grade hardware. Since it is OpenAI-compatible, it only requires setting the base path as a parameter in the OpenAI client. There are also langchain-js and langchain-python examples in the LocalAI repository here: https://github.com/go-skynet/LocalAI/blob/577d36b5964ba0873b945f2ea00404ed52556408/examples/langchain/langchainjs-localai-example/src/index.mts#L7
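To illustrate the "just change the base path" idea: below is a minimal sketch that builds an OpenAI-style chat-completions request against a custom base URL instead of api.openai.com. The base URL, port, and model name are assumptions for illustration; adjust them to match your LocalAI deployment.

```python
# Sketch: targeting a LocalAI server through the OpenAI-compatible
# /v1/chat/completions endpoint. Only the base URL changes; no API
# token is required for a default local deployment (assumption).
import json
import urllib.request

def build_chat_request(base_url: str, model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-compatible chat request aimed at a custom base URL."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        url=f"{base_url.rstrip('/')}/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Hypothetical local endpoint and model name -- swap in your own.
req = build_chat_request("http://localhost:8080", "ggml-gpt4all-j", "Hello!")
print(req.full_url)  # http://localhost:8080/v1/chat/completions
```

Sending the request (e.g. with `urllib.request.urlopen(req)`) would require a LocalAI server actually listening at that address.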
Hey, @mudler
We were discussing adding them this week but couldn't decide the approach because it seems the OpenAI node would suffice.
What do you think would be a good approach?
Indeed. However, I think it's also more convenient for users to be able to find the proper integration. In the case of LocalAI, the user must provide a baseURL rather than a token, so using the OpenAI node might be confusing.
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.
ping?
+1 on LocalAI. It would also make it possible to use local embeddings, and the OpenAI integration does not seem to allow custom model names.
LocalAI is just a matter of changing the OpenAI API base.
That option is in the button as shown:
With the features we are adding this week, we'll be able to set up that kind of component more easily.
Thanks. Can you also make it so that the model name field in OpenAI accepts free text?
I've just posted this as an issue before researching this thread. +1 (I'm new to posting issues.)