shamin10

Results: 1 comment of shamin10

Thank you. I'm trying to use bling-sheared-llama-1.3b-0.1, which I have downloaded to my PC. I want to use:

# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM
tokenizer = ...
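
Since the checkpoint is already on disk, one way to finish this is to point from_pretrained at the local folder instead of the Hub repo name. The sketch below is a minimal example under that assumption; the path "./bling-sheared-llama-1.3b-0.1" is a placeholder for wherever the files were saved, and the "<human>: ... <bot>:" prompt wrapper is the style described on the llmware BLING model cards, so verify both against your setup.

# Minimal sketch: load a locally downloaded bling-sheared-llama-1.3b-0.1 and run one generation.
from transformers import AutoTokenizer, AutoModelForCausalLM

model_path = "./bling-sheared-llama-1.3b-0.1"  # assumed local download location; adjust to your folder

tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(model_path)

# BLING models are instruction-tuned; the wrapper below follows the model card's
# documented prompt style (verify against the version you downloaded).
prompt = "<human>: What is the capital of France?\n<bot>:"

inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=100)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))

If the files were fetched with huggingface_hub or git clone, passing that directory as model_path keeps everything offline; otherwise passing the Hub id "llmware/bling-sheared-llama-1.3b-0.1" would re-download the weights.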