werruww
This does not run on a Colab T4:
import torch
from diffusers import FluxTransformer2DModel, FluxPipeline

model_id = "black-forest-labs/FLUX.1-dev"
nf4_id = "sayakpaul/flux.1-dev-nf4-with-bnb-integration"

model_nf4 = FluxTransformer2DModel.from_pretrained(nf4_id, torch_dtype=torch.bfloat16)
print(model_nf4.dtype)
print(model_nf4.config.quantization_config)

pipe = FluxPipeline.from_pretrained(model_id, transformer=model_nf4, torch_dtype=torch.bfloat16)
# pipe.enable_model_cpu_offload()
# pipe.enable_sequential_cpu_offload()

prompt = "A mystic...
How do I create a token on Hugging Face? Where do I find my Access Tokens?
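Tokens are created on the Hugging Face website under Settings → Access Tokens (https://huggingface.co/settings/tokens). A minimal sketch of wiring one into a script; the token value below is a placeholder, not a real credential:

```python
import os

# Placeholder value; paste a real token created at
# https://huggingface.co/settings/tokens
os.environ["HF_TOKEN"] = "hf_xxx"

# With HF_TOKEN set, huggingface_hub picks the token up automatically.
# Alternatively, log in explicitly:
#   from huggingface_hub import login
#   login(token=os.environ["HF_TOKEN"])
# or, from a terminal: huggingface-cli login
print(os.environ["HF_TOKEN"].startswith("hf_"))  # → True
```

Gated repos such as black-forest-labs/FLUX.1-dev also require accepting the license on the model page before the token grants download access.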
https://github.com/werruww/transformersjs/blob/main/Untitled32%20(1).ipynb
from airllm import AutoModel
import torch

MAX_LENGTH = 15

# could use a Hugging Face model repo id:
model = AutoModel.from_pretrained(
    "unsloth/Llama-3.1-Nemotron-70B-Instruct-bnb-4bit",
    delete_original=True,
)

input_text = ['What is the capital of United...
How do I install transformers==4.40.0.dev0?
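Dev versions like 4.40.0.dev0 are not published to PyPI, so `pip install transformers==4.40.0.dev0` fails. A sketch of the usual options, installing from the GitHub source instead:

```shell
# Install the current development build straight from the repository:
pip install git+https://github.com/huggingface/transformers.git

# Or, if the released 4.40.x line is enough, install the wheel from PyPI:
pip install "transformers==4.40.0"
```

To match the exact state that was 4.40.0.dev0, you would pin the git install to a specific commit from that development window with `@<commit-sha>` appended to the URL.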