Error During Tokenizer Initialization When Deploying SD 3.5 Large Turbo
I'm encountering an issue while deploying the Stable Diffusion 3.5 Large Turbo model. About 67% of the model components download successfully, but the process then fails during the tokenizer initialization phase and the backend worker process dies unexpectedly. Has anyone run into something similar, or does anyone have suggestions on how to troubleshoot or fix this?
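For context, the loading path is essentially the standard diffusers pattern for SD3 (simplified sketch; the model ID, dtype, and sampler settings below are assumptions, and the real deployment wraps this in a backend worker):

```python
# Simplified sketch of the loading path. The model ID, dtype, and sampler
# settings are assumptions; the real deployment wraps this in a backend
# worker process.
import torch
from diffusers import StableDiffusion3Pipeline

pipe = StableDiffusion3Pipeline.from_pretrained(
    "stabilityai/stable-diffusion-3.5-large-turbo",
    torch_dtype=torch.bfloat16,
)
pipe.to("cuda")

# The Turbo variant is tuned for very few steps and no classifier-free guidance.
image = pipe(
    "a photo of a cat",
    num_inference_steps=4,
    guidance_scale=0.0,
).images[0]
image.save("out.png")
```

The failure happens before any generation, while the pipeline components (including the tokenizers) are being initialized.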
Here are the libraries in use:

diffusers>=0.32.2
torchvision>=0.19.0
transformers>=4.43.2
accelerate>=0.34.0
safetensors>=0.4.3
torch>=2.4.0
pillow==11.1.0
sentencepiece==0.2.0
huggingface-hub[hf_transfer]==0.25.2
tokenizers==0.19.1
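To narrow things down, this is the kind of isolation test I'm considering: loading just the three tokenizers (the two CLIP tokenizers plus the T5 tokenizer) outside the full pipeline. The repo ID and subfolder names here assume the standard SD3 repo layout:

```python
# Sketch: load only the three tokenizers that the SD3 pipeline initializes,
# to check whether the failure reproduces outside the full pipeline.
# Subfolder names assume the standard SD3 repo layout.
from transformers import AutoTokenizer

repo = "stabilityai/stable-diffusion-3.5-large-turbo"
for sub in ("tokenizer", "tokenizer_2", "tokenizer_3"):
    tok = AutoTokenizer.from_pretrained(repo, subfolder=sub)
    print(sub, type(tok).__name__, "ok")
```

If this fails in the same way, the problem is presumably in the tokenizers/sentencepiece stack rather than in diffusers itself.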
Thanks!