llama-stack
Use the list endpoint instead of ps to get ollama's models
What does this PR do?
Change the endpoint used to determine which models are available in Ollama from `/api/ps` (models currently loaded in memory) to `/api/tags` (all locally installed models).
- [x] Addresses issue #332
Feature/Issue validation/testing/test plan
To reproduce the problem observed in the issue above:
- Start the `ollama` server (ensure there are compatible models installed).
- Ensure there is no current process running any model, via `ollama ps`.
- Start the llama-stack server: no model will be found, although the models are usable in Ollama.
After the change, there is no need to have a process running the models in Ollama: the stack picks up all installed models anyway.
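A minimal sketch of why the endpoint matters. The JSON payloads below are illustrative, shaped like the responses Ollama's API documentation describes (`/api/ps` reports only models loaded in memory; `/api/tags` reports everything installed), and `model_names` is a hypothetical helper, not the provider's actual code:

```python
import json

# Illustrative responses (assumed shapes, per Ollama's API docs):
# /api/ps -> models currently loaded in memory; empty when nothing is running.
PS_RESPONSE = json.dumps({"models": []})
# /api/tags -> every locally installed model, running or not.
TAGS_RESPONSE = json.dumps({
    "models": [
        {"name": "llama3.2:3b-instruct-fp16"},
        {"name": "llama3.1:8b-instruct-fp16"},
    ]
})

def model_names(response_body: str) -> list[str]:
    """Extract model names from an Ollama /api/ps- or /api/tags-style JSON body."""
    return [m["name"] for m in json.loads(response_body).get("models", [])]

# Before the change: nothing is running, so the stack sees no models.
print(model_names(PS_RESPONSE))    # []
# After the change: installed models are discovered regardless.
print(model_names(TAGS_RESPONSE))  # ['llama3.2:3b-instruct-fp16', 'llama3.1:8b-instruct-fp16']
```

This is why the issue only reproduced when no model process was active: both endpoints return the same `{"models": [...]}` shape, but they answer different questions.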
Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Ran pre-commit to handle lint / formatting issues.
- [x] Read the contributor guideline, Pull Request section?
- [x] Updated relevant documentation.
- [ ] Wrote necessary unit or integration tests.