llm
Idea: use Ollama's new /api/show for model capabilities
The new Ollama release 0.6.4 includes an API method, /api/show, which can be useful for checking model capabilities: https://github.com/ollama/ollama/releases
This is interesting, thanks for noting it. One thing I worry about is making a fast call a slow call because it now has to hit an API endpoint. But it may be better than me trying to put all that info into the llm library.
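A minimal sketch of what the suggested check could look like, using only the standard library. This assumes a local Ollama server at the default port and that the /api/show response includes a `capabilities` list (present in recent Ollama versions); the function names here are illustrative, not part of the `llm` library:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # default local Ollama endpoint


def show_model(name: str, base_url: str = OLLAMA_URL) -> dict:
    """POST to /api/show and return the parsed JSON response."""
    req = urllib.request.Request(
        f"{base_url}/api/show",
        data=json.dumps({"model": name}).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


def capabilities(show_response: dict) -> list:
    """Extract the capabilities list from an /api/show response.

    Falls back to an empty list for older Ollama versions that
    do not report capabilities.
    """
    return show_response.get("capabilities", [])


# Example (requires a running Ollama server):
#   caps = capabilities(show_model("llama3.2"))
#   supports_tools = "tools" in caps
```

Since the network call only happens per model name, the result could be cached to avoid turning every fast lookup into an HTTP round trip, which is the performance concern raised above.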