[bug]: Model manager fails to install "CLIP", "VAE", and "text encoder" models for the Flux dev model (in-place install option)
Is there an existing issue for this problem?
- [X] I have searched the existing issues
Operating system
Windows
GPU vendor
Nvidia (CUDA)
GPU model
RTX 3090
GPU VRAM
24GB
Version number
5.4.2
Browser
Brave
Python dependencies
No response
What happened
The Model Manager is unable to determine the model type for VAE and CLIP files required for the Flux dev model. As a result, I cannot "invoke" any images with Flux.
For example, it fails to install (or recognize) these model files, which work without any issues in other AI image-generation apps:
- `clip_l.safetensors`
- `t5xxl_fp16.safetensors`
- `ae.safetensors`
Here’s an error log snippet:
```
[2024-11-28 11:53:11,445]::[ModelInstallService]::INFO --> Model install started: D:/programs/StabilityMatrix/Models/CLIP/clip_l.safetensors
[2024-11-28 11:53:11,466]::[ModelInstallService]::ERROR --> Model install error: D:/programs/StabilityMatrix/Models/CLIP/clip_l.safetensors InvalidModelConfigException: Unable to determine model type for D:\programs\StabilityMatrix\Models\CLIP\clip_l.safetensors
```
This issue occurs for all VAE, CLIP, and text encoder files needed for Flux. These files are provided and recommended by Black Forest Labs for the Flux dev model.
What you expected to happen
These model files should be installed (recognized as valid model files) by the Invoke model manager when using the in-place install option.
How to reproduce the problem
Install the app, go to the model section, and set the Flux VAE and CLIP model file paths, installing with the in-place install option to avoid duplicating the same files.
Additional context
No response
Discord username
No response
+1
This prevents me from being able to bring up a fresh install of invoke-ai that is pre-loaded with a set of flux models.
Same, it's unfortunately unusable for Flux in this state. How do you get these to load? Is there any workaround? Even installing from huggingface with the model manager all over again didn't work. I guess you can't use FLUX with Invoke, which is a bummer.
Update: I went to the model manager and, under "Starter Models", was able to install FLUX dev. That worked. Unfortunately it didn't work with my existing FLUX models. Weird. I don't know the difference between them, but I do seem to get slightly different results, and I would have preferred to use the models I already used with ComfyUI. I don't want duplicates. At least this seems to work now if you're all in on Invoke.
> Same, it's unfortunately unusable for Flux in this state. How do you get these to load?
I'm able to download the flux stuff through the model manager, after cleaning up what is left over after a failed attempt to import models fetched by a previous instance of InvokeAI. The model in my backup is identical to what is downloaded by the model manager; I think the reason the HF install works but the local install doesn't is probably related to some metadata that the model manager gets from HF that isn't part of a zipped models directory.
If you download the file by hand, you can place it in the correct directory yourself; just remember to rename it to the expected `<name>.<extension>`, e.g. `flux1-fill-dev.safetensors` -> `FLUX Fill.safetensors` (use quotes as necessary):
https://github.com/invoke-ai/InvokeAI/blob/c5069557f385648005bcb76b3fdee9e29adc3f8f/invokeai/backend/model_manager/starter_models.py#L681
Then scan for models (either everywhere or just in that directory), and the file should now be recognized instead of failing with "unable to determine the model type".
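The copy-and-rename step above can be sketched in Python. This is a minimal illustration, not part of InvokeAI itself: the folders are simulated with temp directories so the snippet runs anywhere, and in practice you would substitute your actual download folder and InvokeAI models directory.

```python
import shutil
import tempfile
from pathlib import Path

# Simulated folders so this sketch is runnable as-is; in practice these
# would be your download folder and the InvokeAI models directory
# (both paths here are assumptions, not real InvokeAI defaults).
downloads = Path(tempfile.mkdtemp())
models_dir = Path(tempfile.mkdtemp())

# Stand-in for the file downloaded by hand from Hugging Face.
src = downloads / "flux1-fill-dev.safetensors"
src.touch()

# The destination filename must match the starter-model name listed in
# starter_models.py (linked above), e.g. "FLUX Fill.safetensors" --
# note the space, hence the need for quotes on a shell command line.
dest = models_dir / "FLUX Fill.safetensors"
shutil.copy2(src, dest)

print(dest.name)  # the renamed file a model scan should now pick up
```

After copying, trigger a model scan in the UI so the renamed file gets registered.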