
[bug]: Fails to set "CLIP", "VAE", and "text encoder" models for the Flux dev model (in-place install option) via the model manager

Open · tokitoki22 opened this issue on Nov 28, 2024 · 4 comments

Is there an existing issue for this problem?

  • [X] I have searched the existing issues

Operating system

Windows

GPU vendor

Nvidia (CUDA)

GPU model

RTX 3090

GPU VRAM

24GB

Version number

5.4.2

Browser

Brave

Python dependencies

No response

What happened

The Model Manager is unable to determine the model type for VAE and CLIP files required for the Flux dev model. As a result, I cannot "invoke" any images with Flux.

For example, it fails to install (or recognize) these model files, which work without any issues in other AI image-generation apps:

  • clip_l.safetensors
  • t5xxl_fp16.safetensors
  • ae.safetensors


Here’s an error log snippet:

[2024-11-28 11:53:11,445]::[ModelInstallService]::INFO --> Model install started: D:/programs/StabilityMatrix/Models/CLIP/clip_l.safetensors
[2024-11-28 11:53:11,466]::[ModelInstallService]::ERROR --> Model install error: D:/programs/StabilityMatrix/Models/CLIP/clip_l.safetensors InvalidModelConfigException: Unable to determine model type for D:\programs\StabilityMatrix\Models\CLIP\clip_l.safetensors

This issue occurs for all VAE, CLIP, and text encoder files needed for Flux. These files are provided and recommended by Black Forest Labs for the Flux dev model.
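For context, the model manager's probe classifies a loose checkpoint largely from the tensor names inside the file, which is what the InvalidModelConfigException above is complaining about. Below is a minimal sketch (not Invoke's actual probe code, just a way to see what such a probe has to work with) that lists the tensor keys in a .safetensors file; it only uses the documented safetensors layout of an 8-byte little-endian header length followed by a JSON header.

    # Minimal sketch, assuming only the published safetensors file layout:
    # 8 bytes (u64, little-endian) header length, then a JSON header mapping
    # tensor names to dtype/shape. Listing the names shows what a model-type
    # probe sees when it inspects a loose checkpoint like clip_l.safetensors.
    import json
    import struct
    import sys

    def list_safetensors_keys(path: str, limit: int = 20) -> None:
        with open(path, "rb") as f:
            (header_len,) = struct.unpack("<Q", f.read(8))  # header size in bytes
            header = json.loads(f.read(header_len))
        keys = [k for k in header if k != "__metadata__"]  # skip optional metadata block
        print(f"{path}: {len(keys)} tensors")
        for key in keys[:limit]:
            print("  ", key, header[key].get("shape"))

    if __name__ == "__main__":
        # e.g. python inspect_keys.py clip_l.safetensors
        list_safetensors_keys(sys.argv[1])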

What you expected to happen

These model files should be recognized and installed by the Invoke model manager (with the in-place install option).

How to reproduce the problem

Install the app, go to the model section, and point the model manager at the Flux VAE and CLIP model files using the "in-place install" option, to avoid duplicating files that already exist on disk.

Additional context

No response

Discord username

No response

tokitoki22, Nov 28, 2024

+1

This prevents me from being able to bring up a fresh install of invoke-ai that is pre-loaded with a set of flux models.

dreness, Jan 22, 2025

Same, it's unfortunately unusable for Flux in this state. How do you get these to load? Is there any workaround? Even installing from Hugging Face with the model manager all over again didn't work. I guess you can't use FLUX with Invoke, that's a bummer.

Update: I went to the model manager and, under "Starter Models", was able to install FLUX dev. That worked. Unfortunately it didn't work with my existing FLUX models. Weird. I don't know what the difference between them is, but I think I get slightly different results, and I would have preferred to use the models I already used with ComfyUI; I don't want duplicates. At least this works now if you're all in on Invoke.

tmaiaroto, Feb 1, 2025

Same, it's unfortunately unusable for Flux in this state. How do you get these to load?

I'm able to download the flux stuff through the model manager, after cleaning up what is left over after a failed attempt to import models fetched by a previous instance of InvokeAI. The model in my backup is identical to what is downloaded by the model manager; I think the reason the HF install works but the local install doesn't is probably related to some metadata that the model manager gets from HF that isn't part of a zipped models directory.
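One way to check the "identical" claim is to compare checksums of the local backup against what the model manager downloaded. A minimal sketch with hypothetical file paths:

    # Minimal sketch with hypothetical paths: compare SHA-256 hashes to check
    # whether a backed-up model file is byte-identical to the copy the model
    # manager downloaded from Hugging Face.
    import hashlib

    def sha256_of(path: str) -> str:
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):  # read in 1 MiB chunks
                h.update(chunk)
        return h.hexdigest()

    backup = sha256_of("backup/clip_l.safetensors")                # hypothetical path
    downloaded = sha256_of("invokeai/models/clip_l.safetensors")   # hypothetical path
    print("identical" if backup == downloaded else "different")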

dreness, Feb 3, 2025

If you download the file by hand, you can place it in the correct directory yourself. Remember to rename it to the expected <name>.<extension>, e.g. flux1-fill-dev.safetensors -> FLUX Fill.safetensors (use quotes as necessary): https://github.com/invoke-ai/InvokeAI/blob/c5069557f385648005bcb76b3fdee9e29adc3f8f/invokeai/backend/model_manager/starter_models.py#L681

Then scan for models, either everywhere or just in that directory, and the file should now be recognized instead of failing with "unable to determine the model type".
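A minimal sketch of that manual step, with hypothetical paths (the destination file name is the one listed in starter_models.py above; the actual models directory depends on your install):

    # Minimal sketch with hypothetical paths: copy a hand-downloaded checkpoint
    # into the InvokeAI models directory under the expected name, then trigger
    # a model scan from the Model Manager UI.
    import shutil
    from pathlib import Path

    src = Path("downloads/flux1-fill-dev.safetensors")   # hypothetical download location
    models_dir = Path("invokeai/models")                 # assumed install root; yours may differ
    dest = models_dir / "FLUX Fill.safetensors"          # name expected per starter_models.py

    models_dir.mkdir(parents=True, exist_ok=True)
    shutil.copy2(src, dest)
    print(f"Placed {dest}; now run a model scan on {models_dir} in the Model Manager.")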

06kellyjac, Sep 18, 2025