Error when trying to load Mistral-Small-3.2-24B-Instruct-2506
When trying to load "Mistral-Small-3.2-24B-Instruct-2506", I get the following error:
```
04:23:50-340552 INFO  Loading "mistralai_Mistral-Small-3.2-24B-Instruct-2506"
04:24:10-615718 INFO  TRANSFORMERS_PARAMS=
{'low_cpu_mem_usage': True,
 'torch_dtype': torch.bfloat16,
 'attn_implementation': 'sdpa',
 'device_map': 'auto',
 'max_memory': {0: '16GiB', 1: '16GiB', 2: '16GiB', 3: '16GiB'}}

04:24:10-965723 ERROR Failed to load the model.
Traceback (most recent call last):
  File "E:\text-generation-webui-3.7.1\text-generation-webui-3.7.1\modules\ui_model_menu.py", line 198, in load_model_wrapper
    shared.model, shared.tokenizer = load_model(selected_model, loader)
  File "E:\text-generation-webui-3.7.1\text-generation-webui-3.7.1\modules\models.py", line 42, in load_model
    output = load_func_map[loader](model_name)
  File "E:\text-generation-webui-3.7.1\text-generation-webui-3.7.1\modules\models.py", line 82, in transformers_loader
    return load_model_HF(model_name)
  File "E:\text-generation-webui-3.7.1\text-generation-webui-3.7.1\modules\transformers_loader.py", line 259, in load_model_HF
    model = LoaderClass.from_pretrained(path_to_model, **params)
  File "E:\text-generation-webui-3.7.1\text-generation-webui-3.7.1\installer_files\env\Lib\site-packages\transformers\models\auto\auto_factory.py", line 603, in from_pretrained
    raise ValueError(
ValueError: Unrecognized configuration class <class 'transformers.models.mistral3.configuration_mistral3.Mistral3Config'> for this kind of AutoModel: AutoModelForCausalLM.
Model type should be one of ArceeConfig, AriaTextConfig, BambaConfig, BartConfig, BertConfig, BertGenerationConfig, BigBirdConfig, BigBirdPegasusConfig, BioGptConfig, BitNetConfig, BlenderbotConfig, BlenderbotSmallConfig, BloomConfig, CamembertConfig, LlamaConfig, CodeGenConfig, CohereConfig, Cohere2Config, CpmAntConfig, CTRLConfig, Data2VecTextConfig, DbrxConfig, DeepseekV3Config, DiffLlamaConfig, Dots1Config, ElectraConfig, Emu3Config, ErnieConfig, FalconConfig, FalconH1Config, FalconMambaConfig, FuyuConfig, GemmaConfig, Gemma2Config, Gemma3Config, Gemma3TextConfig, Gemma3nConfig, Gemma3nTextConfig, GitConfig, GlmConfig, Glm4Config, GotOcr2Config, GPT2Config, GPT2Config, GPTBigCodeConfig, GPTNeoConfig, GPTNeoXConfig, GPTNeoXJapaneseConfig, GPTJConfig, GraniteConfig, GraniteMoeConfig, GraniteMoeHybridConfig, GraniteMoeSharedConfig, HeliumConfig, JambaConfig, JetMoeConfig, LlamaConfig, Llama4Config, Llama4TextConfig, MambaConfig, Mamba2Config, MarianConfig, MBartConfig, MegaConfig, MegatronBertConfig, MiniMaxConfig, MistralConfig, MixtralConfig, MllamaConfig, MoshiConfig, MptConfig, MusicgenConfig, MusicgenMelodyConfig, MvpConfig, NemotronConfig, OlmoConfig, Olmo2Config, OlmoeConfig, OpenLlamaConfig, OpenAIGPTConfig, OPTConfig, PegasusConfig, PersimmonConfig, PhiConfig, Phi3Config, Phi4MultimodalConfig, PhimoeConfig, PLBartConfig, ProphetNetConfig, QDQBertConfig, Qwen2Config, Qwen2MoeConfig, Qwen3Config, Qwen3MoeConfig, RecurrentGemmaConfig, ReformerConfig, RemBertConfig, RobertaConfig, RobertaPreLayerNormConfig, RoCBertConfig, RoFormerConfig, RwkvConfig, SmolLM3Config, Speech2Text2Config, StableLmConfig, Starcoder2Config, TransfoXLConfig, TrOCRConfig, WhisperConfig, XGLMConfig, XLMConfig, XLMProphetNetConfig, XLMRobertaConfig, XLMRobertaXLConfig, XLNetConfig, XmodConfig, ZambaConfig, Zamba2Config.
```
I'm running Windows 11 Pro 24H2 with 128 GB RAM and 4x RTX 4060 Ti 16 GB. Please let me know if you need any further info; any help would be greatly appreciated.
Edit: I also tried "unsloth/Mistral-Small-3.2-24B-Instruct-2506" and got the same error.
I am facing the issue below with this model:

```
RuntimeError: Failed to load a HF tokenizer for mistralai/Mistral-Small-3.2-24B-Instruct-2506: <class 'transformers.models.mistral3.configuration_mistral3.Mistral3Config'>
```
I tested with 3.10 and I'm still getting this error. The HF model page says it needs `Mistral3ForConditionalGeneration` and `mistral-common >= 1.6.2`.
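For context, the traceback above shows the webui loading the checkpoint through `AutoModelForCausalLM`, but the mistral3 architecture is multimodal and isn't registered under that auto class, which is why the `ValueError` lists every supported config except `Mistral3Config`. A rough sketch of loading it directly with the class the model page names (this is an untested illustration, not a webui fix: it assumes a recent transformers build that includes the mistral3 architecture, that the repo's tokenizer/processor files load cleanly, and enough GPU memory for a 24B model in bf16):

```python
# Sketch only: load the checkpoint with the class the HF model page names,
# instead of AutoModelForCausalLM (which raises the ValueError above).
# Assumes a transformers version that ships the mistral3 architecture;
# the processor load may additionally depend on mistral-common per the
# model card, and the whole thing needs substantial GPU memory.
import torch
from transformers import AutoProcessor, Mistral3ForConditionalGeneration

model_id = "mistralai/Mistral-Small-3.2-24B-Instruct-2506"

processor = AutoProcessor.from_pretrained(model_id)
model = Mistral3ForConditionalGeneration.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # mirrors TRANSFORMERS_PARAMS from the log
    device_map="auto",           # spread across the available GPUs
)
```

This doesn't change what the webui does internally; it only illustrates that the checkpoint wants a conditional-generation class rather than a causal-LM one, which is consistent with both errors quoted above.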
I'm not a coder, but I tried updating Transformers in the CLI to version 4.56.0.dev0 in the hope that it would install `Mistral3ForConditionalGeneration`, and I installed mistral-common 1.6.2, but it didn't help. Any ideas would be appreciated.
Didn't mean to close the issue.