Ollama download model if not present
Super glad to see the Ollama support in the recent releases! It has made the tool far more accessible for those who don't want to risk a large OpenAI bill.
Is your feature request related to a problem? Please describe.
When using mods on a new system, it's frustrating to invoke mods against Ollama and have it fail because the model isn't on your system yet.
Describe the solution you'd like
It would be great if mods checked (perhaps via the /api/show or /api/tags endpoints) whether the model you are trying to invoke is present before calling it. If it is not, mods could download it with an /api/pull request and display a message along the lines of "The model you tried to invoke is not present, pulling...".
Currently, it just fails with an "Unknown API error."
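For illustration, here is a minimal sketch of what that check-then-pull flow could look like, assuming Ollama's default address at http://localhost:11434 and a hypothetical model name. This is not mods' actual code, just the two API calls described above:

```go
package main

import (
	"bufio"
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
	"strings"
)

const ollamaBase = "http://localhost:11434" // assumed default Ollama address

// hasModel asks /api/tags for the list of locally available models.
func hasModel(model string) (bool, error) {
	resp, err := http.Get(ollamaBase + "/api/tags")
	if err != nil {
		return false, err
	}
	defer resp.Body.Close()

	var tags struct {
		Models []struct {
			Name string `json:"name"`
		} `json:"models"`
	}
	if err := json.NewDecoder(resp.Body).Decode(&tags); err != nil {
		return false, err
	}
	for _, m := range tags.Models {
		// Ollama lists models as "name:tag"; match either form.
		if m.Name == model || strings.TrimSuffix(m.Name, ":latest") == model {
			return true, nil
		}
	}
	return false, nil
}

// pullModel streams /api/pull progress lines until the download finishes.
func pullModel(model string) error {
	body, _ := json.Marshal(map[string]string{"name": model})
	resp, err := http.Post(ollamaBase+"/api/pull", "application/json", bytes.NewReader(body))
	if err != nil {
		return err
	}
	defer resp.Body.Close()

	scanner := bufio.NewScanner(resp.Body)
	for scanner.Scan() {
		var status struct {
			Status string `json:"status"`
		}
		if err := json.Unmarshal(scanner.Bytes(), &status); err == nil {
			fmt.Println(status.Status)
		}
	}
	return scanner.Err()
}

func main() {
	model := "llama3" // hypothetical model name
	ok, err := hasModel(model)
	if err != nil {
		panic(err)
	}
	if !ok {
		fmt.Println("The model you tried to invoke is not present, pulling...")
		if err := pullModel(model); err != nil {
			panic(err)
		}
	}
}
```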
Describe alternatives you've considered
The alternative is to download the model manually before invoking mods.
Additional context
N/A
I'm not sure this would sit well within the scope of mods, but it could probably do a better job with the error, though.