
[bug]: cannot import the models in ollama

Open zq19 opened this issue 8 months ago • 1 comment

Is there an existing issue for this problem?

  • [x] I have searched the existing issues

Operating system

macOS

GPU vendor

None (CPU)

GPU model

No response

GPU VRAM

No response

Version number

1.4.1

Browser

Chrome Version 131.0.6778.205 (Official Build) (arm64)

Python dependencies

No response

What happened

Add models: Url or local path ~/.ollama/models/manifests/registry.ollama.ai/library/llama3.2/latest: unrecognized suffix

What you expected to happen

Invoke can use the models in Ollama.

How to reproduce the problem

No response

Additional context

No response

Discord username

No response

zq19, Mar 04 '25 06:03

Invoke doesn't ship with anything related to ollama. Maybe this is not the right place for this bug report?

psychedelicious, Mar 04 '25 13:03