InvokeAI
[bug]: cannot import models from Ollama
Is there an existing issue for this problem?
- [x] I have searched the existing issues
Operating system
macOS
GPU vendor
None (CPU)
GPU model
No response
GPU VRAM
No response
Version number
1.4.1
Browser
Chrome Version 131.0.6778.205 (Official Build) (arm64)
Python dependencies
No response
What happened
Add models: Url or local path ~/.ollama/models/manifests/registry.ollama.ai/library/llama3.2/latest: unrecognized suffix
What you expected to happen
Invoke can use the models installed in Ollama.
How to reproduce the problem
No response
Additional context
No response
Discord username
No response
Invoke doesn't ship with anything related to Ollama. Maybe this is not the right place for this bug report?
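For context, the "unrecognized suffix" error is consistent with a suffix-based importer check: the path in the report ends in `latest`, an Ollama manifest file with no file extension, whereas model importers typically key on extensions such as `.safetensors` or `.ckpt`. A minimal sketch of that kind of check (the suffix list and function are illustrative, not InvokeAI's actual code):

```python
from pathlib import Path

# Suffixes commonly accepted by checkpoint importers
# (illustrative list, not InvokeAI's actual configuration).
RECOGNIZED_SUFFIXES = {".safetensors", ".ckpt", ".pt", ".bin"}

def has_recognized_suffix(path: str) -> bool:
    """Return True if the path's extension looks like a model file."""
    return Path(path).suffix.lower() in RECOGNIZED_SUFFIXES

# The Ollama manifest path from the report ends in "latest":
# no extension at all, so a suffix check rejects it outright.
manifest = "~/.ollama/models/manifests/registry.ollama.ai/library/llama3.2/latest"
print(has_recognized_suffix(manifest))             # False
print(has_recognized_suffix("model.safetensors"))  # True
```

Ollama stores models as OCI-style manifests plus content-addressed blobs, not as single checkpoint files, so even beyond the suffix check the file at that path would not be a loadable model weight.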