Flowise
[FEATURE] Add support for llama.cpp and Weaviate
**Describe the feature you'd like**
Could you please add support for
- llama.cpp embedding
- llama.cpp llm
- Weaviate vector DB
**Additional context**
I have LangChain + OpenAI + Pinecone working for conversational Q&A retrieval against an enterprise knowledge base, but I would like to use open-source, locally run alternatives (llama.cpp for embeddings and the LLM, Weaviate for the vector DB) so that my enterprise data stays on premise. Thank you.
Thanks for the suggestion!
As this project is built on top of LangChainJS, llama.cpp is not yet supported. It has also been requested here: https://github.com/hwchase17/langchainjs/issues/710
We can definitely add Weaviate soon. In the meantime, Supabase and Chroma are open source and can also be run locally.
Weaviate was recently added to LangChainJS, with the same interface as the other VectorStores. Also, perhaps have a look at https://github.com/go-skynet/LocalAI — it's a way of running open models locally behind an API that is compatible with OpenAI's API interface. It should be a great stopgap while there is no llama.cpp support.
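Because LocalAI exposes the same REST interface as OpenAI, existing client code mostly just needs its base URL swapped. A minimal sketch, assuming a LocalAI instance on its default port (`localhost:8080`) and a hypothetical model name — substitute whatever model you have placed in LocalAI's models directory:

```typescript
// Sketch: talking to a locally running LocalAI server through its
// OpenAI-compatible chat-completions endpoint. The base URL and the
// model name "ggml-gpt4all-j" are assumptions for illustration.
const LOCALAI_BASE_URL = "http://localhost:8080/v1";

interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// Build the same JSON request body the OpenAI endpoint expects,
// so the only change from an OpenAI setup is the URL.
function buildChatRequest(model: string, messages: ChatMessage[]) {
  return {
    url: `${LOCALAI_BASE_URL}/chat/completions`,
    init: {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ model, messages }),
    },
  };
}

// Usage (requires a running LocalAI instance):
// const { url, init } = buildChatRequest("ggml-gpt4all-j", [
//   { role: "user", content: "Hello" },
// ]);
// const res = await fetch(url, init);
// const data = await res.json();
```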
Weaviate is now added to Flowise - https://github.com/FlowiseAI/Flowise/pull/45
Closing this, as you can now use LocalAI (https://github.com/FlowiseAI/Flowise/pull/123) to run local LLMs and embeddings. So far this is the only viable solution for JS/TS-based projects.
> Weaviate is now added to Flowise - #45
I'm desperately in need of a tutorial on how to use it with Flowise.
> Weaviate is now added to Flowise - #45
It still seems to be missing from the docs. Any particular reason, or just an oversight? => https://docs.flowiseai.com/integrations/langchain/vector-stores