
IndexError: tuple index out of range

Open jaskirat8 opened this issue 2 years ago • 7 comments

To get bootstrapped, I tried the example from the README:

from transformers import AutoTokenizer
from petals import AutoDistributedModelForCausalLM
import torch 

# Choose any model available at https://health.petals.dev
model_name = "petals-team/StableBeluga2"  # This one is fine-tuned Llama 2 (70B)

# Connect to a distributed network hosting model layers
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoDistributedModelForCausalLM.from_pretrained(model_name, torch_dtype=torch.float32)

# Run the model as if it were on your computer
inputs = tokenizer("A cat sat", return_tensors="pt")["input_ids"]
outputs = model.generate(input_ids=inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0]))  # A cat sat on a mat...

The `torch_dtype=torch.float32` argument was added because of a CPU-support warning; apart from that, the code is identical to the original example. Yet I still hit the error and cannot complete inference.

[Screenshot from 2023-10-25 showing the IndexError traceback]

OS: Ubuntu 22.04, CPU: i7-7700K, GPU: Nvidia 1070

Please advise if I am missing something here.

jaskirat8 avatar Oct 25 '23 05:10 jaskirat8

Same error... this doesn't work anymore. The maintainers also appear to be inactive.

AIAnytime avatar Nov 05 '23 06:11 AIAnytime

+1

Running into the same error when calling the `model.generate()` method in the getting-started Colab notebook.

[Screenshot from 2023-11-05 showing the same IndexError]

daspartho avatar Nov 05 '23 07:11 daspartho

Found a relevant issue: huggingface/transformers#10160

daspartho avatar Nov 05 '23 07:11 daspartho

This seems to be an issue with the petals library itself rather than with transformers, since replacing `AutoDistributedModelForCausalLM` with `AutoModelForCausalLM` works fine.

daspartho avatar Nov 05 '23 09:11 daspartho
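The workaround described above (falling back to plain transformers) can be sketched roughly as follows. Note this is a minimal illustration, not code from either library: the `load_model` helper is hypothetical, and actually running StableBeluga2 (70B) locally requires far more memory than a typical machine has.

```python
def load_model(model_name: str, distributed: bool = False):
    """Load a causal LM either through petals (distributed across the
    swarm) or through plain transformers (fully local fallback)."""
    if distributed:
        # The path that currently raises IndexError in this thread.
        from petals import AutoDistributedModelForCausalLM as ModelClass
    else:
        # The fallback that reportedly works fine.
        from transformers import AutoModelForCausalLM as ModelClass
    return ModelClass.from_pretrained(model_name)

# Usage (not run here; a 70B model is impractical to load locally):
# model = load_model("petals-team/StableBeluga2")                    # local
# model = load_model("petals-team/StableBeluga2", distributed=True)  # petals
```

Keeping the import inside the function means petals only has to be installed when the distributed path is actually requested.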

@daspartho, I have the same thoughts. I have been using the same models directly for months and they work, so I just wanted to confirm this wasn't my misconfiguration or an overlooked setting. We need to isolate the culprit and work toward a PR, since other folks are facing this too.

jaskirat8 avatar Nov 05 '23 09:11 jaskirat8

Yes, I agree!

also gently pinging @borzunov here

daspartho avatar Nov 05 '23 09:11 daspartho

[working on it]

justheuristic avatar Nov 07 '23 12:11 justheuristic