ibrahim737701


I'm trying to load databricks/dolly-v2-3b using the following code:

```
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("databricks/dolly-v2-3b", padding_side="left")
model = AutoModelForCausalLM.from_pretrained(
    "databricks/dolly-v2-3b",
    device_map="auto",
    trust_remote_code=True,
    torch_dtype=torch.bfloat16,
)
```

And it's giving the following error:

```
OSError: Unable to load weights from pytorch checkpoint file for
'C:\Users\mohammad.ibrahim-st/.cache\huggingface\hub\models--databricks--dolly-v2-3b\snapshots\9c82082015ad144fe64317dae8b0d6e4ee78a732\pytorch_model.bin'
at
'C:\Users\mohammad.ibrahim-st/.cache\huggingface\hub\models--databricks--dolly-v2-3b\snapshots\9c82082015ad144fe64317dae8b0d6e4ee78a732\pytorch_model.bin'.
If you tried to load a PyTorch model from a TF 2.0 checkpoint, please set from_tf=True....
```
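For context (not stated in the original issue): this OSError from transformers usually wraps a deserialization failure on a corrupted or partially downloaded `pytorch_model.bin`, rather than a genuine TF-vs-PyTorch mismatch. A minimal sketch of that failure mode, using the standard-library `pickle` module (which `torch.load` is built on) as a stand-in for the checkpoint file:

```python
import pickle

# Serialize a small stand-in "checkpoint", then truncate it to simulate
# an interrupted download of pytorch_model.bin.
blob = pickle.dumps({"weight": [0.0, 1.0, 2.0, 3.0]})
truncated = blob[: len(blob) // 2]

try:
    pickle.loads(truncated)
    corruption_detected = False
except Exception:
    # Deserialization fails on the cut-off data, which transformers then
    # surfaces as "Unable to load weights from pytorch checkpoint file".
    corruption_detected = True

print(corruption_detected)  # → True
```

If this is the cause, deleting the cached snapshot directory (or re-downloading the file) is the fix; `from_tf=True` will not help unless the repository actually ships a TF checkpoint.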

```
import torch
from instruct_pipeline import InstructionTextGenerationPipeline
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("databricks/dolly-v2-3b", padding_side="left")
model = AutoModelForCausalLM.from_pretrained(
    "databricks/dolly-v2-3b",
    device_map="auto",
    torch_dtype=torch.bfloat16,
    from_tf=True,
)
generate_text = InstructionTextGenerationPipeline(model=model, tokenizer=tokenizer)
```

Even after adding environment variables like:

```
import os

os.environ['RANK'] = '0'
os.environ['WORLD_SIZE'] = '4'
os.environ['MASTER_ADDR'] = 'localhost'
os.environ['MASTER_PORT'] = '12345'
```

the model itself is not loading.
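A hedged aside, not from the original thread: the `RANK`/`WORLD_SIZE`/`MASTER_ADDR`/`MASTER_PORT` variables configure `torch.distributed` process-group initialization and have no effect on checkpoint loading. If the cached file is truncated, re-downloading it is the usual remedy; `force_download=True` is a documented `from_pretrained` parameter that discards the cached copy. A sketch, assuming the rest of the original snippet is otherwise correct (the `reload_dolly` wrapper is illustrative, not from the issue):

```python
import torch
from transformers import AutoModelForCausalLM

def reload_dolly():
    # force_download=True ignores the cached (possibly truncated)
    # pytorch_model.bin and fetches a fresh copy from the Hub.
    return AutoModelForCausalLM.from_pretrained(
        "databricks/dolly-v2-3b",
        device_map="auto",
        torch_dtype=torch.bfloat16,
        trust_remote_code=True,
        force_download=True,
    )
```

Alternatively, manually deleting the `models--databricks--dolly-v2-3b` directory under the Hugging Face cache and re-running the original loading code achieves the same thing.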