
Inferencing with T5

Open · ShivanshuPurohit opened this issue 4 years ago · 1 comment

Bug Report

Which model does this pertain to?

T5

Describe the bug

Running the following script in Colab:

from onnxt5 import GenerativeT5
from onnxt5.api import get_encoder_decoder_tokenizer
decoder_sess, encoder_sess, tokenizer = get_encoder_decoder_tokenizer()
generative_t5 = GenerativeT5(encoder_sess, decoder_sess, tokenizer, onnx=True)
prompt = 'translate English to French: I was a victim of a series of accidents.'
output_text, output_logits = generative_t5(prompt, max_length=100, temperature=0.)

Result (see attached screenshot): it doesn't raise an error or anything, execution just stops partway through, like so.

Provide a code snippet to reproduce your errors

In Google Colab, run:

!pip install onnxt5
from onnxt5 import GenerativeT5
from onnxt5.api import get_encoder_decoder_tokenizer
decoder_sess, encoder_sess, tokenizer = get_encoder_decoder_tokenizer()
generative_t5 = GenerativeT5(encoder_sess, decoder_sess, tokenizer, onnx=True)
prompt = 'translate English to French: I was a victim of a series of accidents.'
output_text, output_logits = generative_t5(prompt, max_length=100, temperature=0.)

ShivanshuPurohit · Feb 14, 2021

You are seeing this because you set max_length=100 but sent input_ids of length 18; that's why the progress bar stops at 18%. If you set max_length to 18, it will reach 100%. Still, this issue needs to be fixed.
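For readers hitting the same behaviour, below is a minimal, hypothetical sketch of what the comment above describes. It is not the onnxt5 source; the names toy_generate and stop_step are invented for illustration. It only shows how a tqdm progress bar sized to max_length can appear to stall, without any error, when the decoding loop exits early.

```python
# Hypothetical illustration (not onnxt5 code): a progress bar sized to
# max_length is left partway done when the decoding loop breaks early.
from tqdm import trange

def toy_generate(max_length, stop_step):
    tokens = []
    for step in trange(max_length):   # bar is sized to max_length
        tokens.append(step)           # stand-in for a decoded token id
        if step + 1 == stop_step:     # decoding finishes early,
            break                     # leaving the bar at stop_step/max_length
    return tokens

# With max_length=100 and decoding stopping after 18 steps, the bar halts
# at 18% without raising an error, matching the behaviour reported above.
print(len(toy_generate(max_length=100, stop_step=18)))  # -> 18
```

As suggested above, passing a smaller max_length (e.g. 18 for this prompt) to generative_t5 lets the bar reach 100%, though the progress reporting itself arguably still needs a fix.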

Ki6an · Feb 24, 2021
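Editor's note: a quick way to check the input length mentioned in the reply above, assuming the tokenizer returned by get_encoder_decoder_tokenizer follows the Hugging Face tokenizer interface (an assumption, not confirmed in this thread).

```python
# Check how many tokens the prompt produces; if this is ~18, it matches
# the 18% stall reported in the issue. Assumes a Hugging Face-style
# tokenizer with an encode() method.
from onnxt5.api import get_encoder_decoder_tokenizer

_, _, tokenizer = get_encoder_decoder_tokenizer()
prompt = 'translate English to French: I was a victim of a series of accidents.'
input_ids = tokenizer.encode(prompt)
print(len(input_ids))
```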