Flowise
[FEATURE] Add OPENAI_API_BASE to the UI to enable usage of the oobabooga openai plugin
Describe the feature you'd like
oobabooga's text-generation-webui has a plugin which emulates the OpenAI API, but in the Flowise UI it is not possible to set the OpenAI base URL. Setting OPENAI_API_BASE via .env doesn't seem to work either.
Do you have the link to the plugin?
It is built in already: https://github.com/oobabooga/text-generation-webui/tree/main/extensions/openai
When I use LangChain in Python I just have to set the OPENAI_API_KEY and OPENAI_API_BASE environment variables:
OPENAI_API_KEY="dummy"
OPENAI_API_BASE="http://127.0.0.1:5001/v1"
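For reference, a minimal sketch of what this looks like against LangChain's (pre-1.0) Python API; the port and prompt are just examples, and the key only needs to be non-empty because the extension does not validate it:

```python
import os

# Point the OpenAI client at the oobabooga openai extension instead of api.openai.com.
os.environ["OPENAI_API_KEY"] = "dummy"  # not validated by the extension
os.environ["OPENAI_API_BASE"] = "http://127.0.0.1:5001/v1"

from langchain.llms import OpenAI  # LangChain reads both variables from the environment

llm = OpenAI(temperature=0)
print(llm("Say hello."))
```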
Instead of running oobabooga to serve your local LLM, you can use LocalAI instead. It gives you an OpenAI-compatible API based on whatever model you choose to run. You can set up LocalAI from the repo here: https://github.com/go-skynet/LocalAI
Then you can use the existing ChatLocalAI node in Flowise; see the example here: https://github.com/go-skynet/LocalAI/tree/master/examples/flowise
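Since LocalAI speaks the OpenAI wire protocol, you can smoke-test it with the plain openai Python client before touching Flowise. A sketch assuming LocalAI's default port 8080 and a model file named ggml-gpt4all-j in its models directory (both are assumptions, not taken from this thread):

```python
import openai

openai.api_key = "dummy"                      # LocalAI ignores the key
openai.api_base = "http://127.0.0.1:8080/v1"  # LocalAI's default listen address

resp = openai.ChatCompletion.create(
    model="ggml-gpt4all-j",                   # must match a file in LocalAI's models dir
    messages=[{"role": "user", "content": "Hello!"}],
)
print(resp.choices[0].message.content)
```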
@TheMasterFX with this PR merged (https://github.com/FlowiseAI/Flowise/pull/264) you can now specify a base path
As far as I can tell LocalAI doesn't seem to support GPTQ models (on GPU)
Looking at the readme and build instructions on their GitHub, it looks like they have GPU support now, even if still experimental. I'll try it out myself soon.
I am attempting to use the LocalAI node with the oobabooga backend. Both are intended to work as OpenAI drop-in replacements, so in theory I should be able to use the LocalAI node with any drop-in OpenAI replacement, right? Well, maybe not, because I can't get it working. It may be that the ChatLocalAI node only needs to be modified slightly to support other backends (or I am doing something wrong).
Start oobabooga with:
call python server.py --auto-devices --chat --wbits 4 --groupsize 128 --api --listen --extensions openai
Then enter the API URL (http://127.0.0.1:5001/v1) as the base path in the model node in the Flowise UI.
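Before wiring it into Flowise, it can help to hit the extension's completions endpoint directly; a quick sanity check (a sketch, not from the thread):

```python
import requests

# The openai extension mirrors the OpenAI /v1/completions response shape.
resp = requests.post(
    "http://127.0.0.1:5001/v1/completions",
    json={"prompt": "Hello", "max_tokens": 16},
)
resp.raise_for_status()
print(resp.json()["choices"][0]["text"])
```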
However, LLMs might have different keywords like "### Instruction:" or "User:".
Where do you find these keywords? Any template suggestions for some known LLMs? 🙏🏽
This might be a good source: https://github.com/oobabooga/text-generation-webui/tree/main/instruction-templates
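As an illustration, the Alpaca family expects the "### Instruction:" / "### Response:" keywords; a sketch of reproducing that with LangChain's PromptTemplate (the exact wording varies per model, so check the templates folder above):

```python
from langchain.prompts import PromptTemplate

# Alpaca-style instruction template; other models use different keywords.
alpaca = PromptTemplate(
    input_variables=["instruction"],
    template=(
        "Below is an instruction that describes a task. "
        "Write a response that appropriately completes the request.\n\n"
        "### Instruction:\n{instruction}\n\n"
        "### Response:\n"
    ),
)
print(alpaca.format(instruction="Summarize what Flowise does."))
```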
Thanks! Do you suggest any models and templates for Flowise? I've tried a bunch of LLMs, but none of them work as expected.