chat-langchain
Functionality with local models
Hi, I've modified the ingest script to work with a local LLM, and now I'm trying to get the chat system working with local models as well. I'm finding a lot of OpenAI function calls in the code. Can these be replaced with tooling for local models, or is this system based on secret sauce from OpenAI?