BRIO
Unable to generate summary when initializing model with PyTorch
Hello, I've tried initializing the model using the provided example in the README:
```python
model = BRIO('Yale-LILY/brio-cnndm-uncased', tok.pad_token_id, is_pegasus=False)
```
However, I've been facing some issues when trying to use it for inference:
- I keep getting errors that parameters used by the `.generate()` method have a value of `None`. I've tried just passing some default values as shown here. Here's how it looks:
```python
inputs = tokenizer([article], max_length=max_length, return_tensors="pt", truncation=True)
summary_ids = model.generate(inputs["input_ids"],
                             early_stopping=False,
                             max_length=1024,
                             num_beams=1,
                             num_beam_groups=1)
```
- This brings me to my second issue: the `.generate()` method does not produce any ids. Whenever I try to decode the generated summary, I get the error `TypeError: 'NoneType' object is not iterable`. When checking the type or content of `summary_ids`, I get `None` or `<class 'NoneType'>`.
Why is this happening? When loading the pre-trained models straight from HF I don't have any issues, but this one does not seem to be working.
It's a problem with the transformers version; using `transformers==4.24.0` or anything below should work.
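Following the answer above, the version can be pinned in the environment so the incompatibility never occurs. A minimal `requirements.txt` fragment (the `transformers` pin is from the reply; including `torch` here is an assumption based on the PyTorch usage in the question):

```
transformers==4.24.0
torch
```

Pinning with `==` rather than `<=` keeps the environment reproducible if the incompatible behavior of newer releases ever changes again.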