crewAI-tools
Allow LM Studio to be usable for RAG instead of just OpenAI
Currently the RAG base class composes with the official OpenAI Python package. It would be desirable to depend only on the OpenAI calling convention instead. That would allow local LLMs (such as Llama 3) served behind an OpenAI-compatible endpoint to produce embeddings while still using OpenAI's calling conventions.
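To illustrate the point about the calling convention: the official client can already talk to any OpenAI-compatible server, so only the base URL and model name need to change. A minimal sketch (the embedding model name is a placeholder for whatever model is loaded in LM Studio):

```python
from openai import OpenAI

# Point the official OpenAI client at a local OpenAI-compatible server.
# LM Studio serves on http://localhost:1234/v1 by default; the API key is ignored.
client = OpenAI(base_url="http://localhost:1234/v1", api_key="na")

# Request embeddings from the local server; the model name must match
# the embedding model loaded in LM Studio (placeholder shown here).
response = client.embeddings.create(
    model="nomic-ai/nomic-embed-text-v1.5-GGUF",
    input=["CrewAI RAG test sentence."],
)
print(len(response.data[0].embedding))
```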
I believe you can now do this using the strategy described in the docs: https://docs.crewai.com/tools/PDFSearchTool/#custom-model-and-embeddings
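A rough sketch of what that looks like, following the structure of the docs example. The specific model names and the assumption that the "openai" provider will route to LM Studio when OPENAI_API_BASE points at it (see the .env variables below) are mine, not from the docs:

```python
from crewai_tools import PDFSearchTool

# Sketch based on the "Custom model and embeddings" docs example.
# Assumption: with OPENAI_API_BASE set to the LM Studio URL, the "openai"
# provider sends both chat and embedding requests to the local server.
tool = PDFSearchTool(
    config=dict(
        llm=dict(
            provider="openai",
            config=dict(
                model="TheBloke/Mistral-7B-Instruct-v0.1-GGUF/mistral-7b-instruct-v0.1.Q2_K.gguf",
            ),
        ),
        embedder=dict(
            provider="openai",
            config=dict(
                model="nomic-ai/nomic-embed-text-v1.5-GGUF",  # placeholder embedding model
            ),
        ),
    )
)
```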
You can also configure the client for local LLM use by setting these .env variables:
OPENAI_API_BASE=http://localhost:1234/v1
OPENAI_MODEL_NAME=TheBloke/Mistral-7B-Instruct-v0.1-GGUF/mistral-7b-instruct-v0.1.Q2_K.gguf
OPENAI_API_KEY=na
This is working for me locally using LM Studio.
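For completeness, a minimal sketch of the same setup done in code rather than a .env file, assuming crewAI still reads these variables for its default OpenAI-style client:

```python
import os

# Same configuration as the .env file above, set before crewAI constructs
# its default OpenAI-style client.
os.environ["OPENAI_API_BASE"] = "http://localhost:1234/v1"
os.environ["OPENAI_MODEL_NAME"] = (
    "TheBloke/Mistral-7B-Instruct-v0.1-GGUF/mistral-7b-instruct-v0.1.Q2_K.gguf"
)
os.environ["OPENAI_API_KEY"] = "na"

from crewai import Agent

# A plain agent now talks to the local LM Studio server instead of api.openai.com.
researcher = Agent(
    role="Local researcher",
    goal="Answer questions using the locally served Mistral model",
    backstory="Runs entirely against LM Studio on localhost.",
)
```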