qabot
Configurable LLM via library
qabot shouldn't assume OpenAI is the only LLM provider: it should also support Gemini, Claude, and DeepSeek, as well as local models.
Need to look into options; perhaps the `llm` library, which already abstracts over multiple providers via plugins, would work.
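Whatever library is chosen, the core change is the same: route all completions through a small provider-agnostic interface instead of calling the OpenAI client directly. A minimal sketch of that shape (all names here — `LLMBackend`, `register`, the `"echo"` backend — are hypothetical illustrations, not qabot's actual API):

```python
from typing import Callable, Dict


class LLMBackend:
    """Hypothetical minimal interface qabot could call; real backends
    would wrap OpenAI, Gemini, Claude, DeepSeek, or a local model."""

    def complete(self, prompt: str) -> str:
        raise NotImplementedError


# Registry mapping a provider name (e.g. from config) to a backend factory.
BACKENDS: Dict[str, Callable[[], LLMBackend]] = {}


def register(name: str):
    def decorator(cls):
        BACKENDS[name] = cls
        return cls
    return decorator


@register("echo")  # stand-in for a local model, for illustration only
class EchoBackend(LLMBackend):
    def complete(self, prompt: str) -> str:
        return prompt.upper()


def get_backend(name: str) -> LLMBackend:
    """Resolve a configured provider name to a backend instance."""
    try:
        return BACKENDS[name]()
    except KeyError:
        raise ValueError(f"unknown LLM provider: {name}")


if __name__ == "__main__":
    backend = get_backend("echo")
    print(backend.complete("hello"))  # HELLO
```

With this shape, adding Gemini or DeepSeek support becomes registering one more backend class, and the provider name can come straight from user configuration.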