ImpAI
Can you use this with Ollama?
Like, pointing the text-based LLM calls to Ollama.
And if I have stable diffusion models already, how to point it at them?
Thank you
Hi! Thanks for using my project! I haven't tried Ollama with ImpAI, but if Ollama exposes a server port the way llama.cpp does, you just need to edit the server URL in the backend.
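As a rough sketch of what that URL swap could look like: Ollama serves an OpenAI-compatible API on port 11434, so if the backend already targets a llama.cpp-style HTTP endpoint, changing the base URL may be enough. The URL, model name, and function names below are assumptions for illustration, not ImpAI's actual code.

```python
import json
import urllib.request

# Assumed local Ollama endpoint (OpenAI-compatible chat API).
OLLAMA_URL = "http://localhost:11434/v1/chat/completions"

def build_payload(prompt: str, model: str = "llama3") -> dict:
    # OpenAI-style chat payload, which Ollama's /v1 endpoint accepts.
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }

def ask_ollama(prompt: str, model: str = "llama3") -> str:
    # Send the request to the local Ollama server and return the reply text.
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_payload(prompt, model)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["choices"][0]["message"]["content"]
```

If the backend keeps its server URL in one config variable, pointing that variable at `OLLAMA_URL` should be the whole change.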
For Stable Diffusion it's another story, because all of my Hugging Face pipeline code lives in the backend.
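For reusing models you already have on disk, a possible approach with the `diffusers` library: a diffusers-format directory loads via `from_pretrained`, while a single `.safetensors`/`.ckpt` checkpoint loads via `from_single_file`. This is a hedged sketch, not ImpAI's actual backend, and the helper names are made up for the example.

```python
from pathlib import Path

def pick_loader(model_path: str) -> str:
    # A single checkpoint file needs from_single_file; a directory in
    # diffusers format needs from_pretrained.
    if Path(model_path).suffix in {".safetensors", ".ckpt"}:
        return "from_single_file"
    return "from_pretrained"

def load_pipeline(model_path: str):
    # Import deferred so the path logic above stays dependency-free.
    from diffusers import StableDiffusionPipeline
    if pick_loader(model_path) == "from_single_file":
        return StableDiffusionPipeline.from_single_file(model_path)
    return StableDiffusionPipeline.from_pretrained(model_path)
```

So in principle the backend could take a local path instead of a Hugging Face model ID, but that would mean adding a branch like this to the pipeline setup.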
But if you have suggestions, I'd be happy to hear them!