UncertainUrza
What would it take to configure Buttercup to make LLM calls against a local inference engine (e.g. Ollama, Llama.cpp, vLLM)? For folks who want to experiment with this project...
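As a rough illustration of what such a setup usually involves (not Buttercup's actual configuration, which the project would have to document): Ollama, vLLM, and llama.cpp's `llama-server` all expose OpenAI-compatible HTTP endpoints, so the common pattern is to point an OpenAI-style client at a local base URL. The environment variable names (`LLM_BASE_URL`, `LLM_API_KEY`, `LLM_MODEL`) and the `llm_endpoint` helper below are hypothetical:

```python
# Hedged sketch: resolving a local OpenAI-compatible endpoint from the
# environment. Env var names and defaults here are illustrative
# assumptions, NOT Buttercup's real settings.
import os

# Documented default local endpoints:
#   Ollama:                    http://localhost:11434/v1
#   vLLM (openai server):      http://localhost:8000/v1
#   llama.cpp (llama-server):  http://localhost:8080/v1

def llm_endpoint() -> dict:
    """Build client settings, defaulting to Ollama's local endpoint."""
    return {
        "base_url": os.environ.get("LLM_BASE_URL", "http://localhost:11434/v1"),
        # Local servers typically ignore the API key; a placeholder suffices.
        "api_key": os.environ.get("LLM_API_KEY", "local"),
        "model": os.environ.get("LLM_MODEL", "llama3"),
    }

print(llm_endpoint()["base_url"])
```

With a dictionary like this, any OpenAI-compatible client can be constructed against the local server instead of a hosted API; switching engines is then just a matter of changing the base URL and model name.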