Inferencing with T5
Bug Report
Which model does this pertain to?
T5
Describe the bug
Running the following script in Google Colab:
from onnxt5 import GenerativeT5
from onnxt5.api import get_encoder_decoder_tokenizer
decoder_sess, encoder_sess, tokenizer = get_encoder_decoder_tokenizer()
generative_t5 = GenerativeT5(encoder_sess, decoder_sess, tokenizer, onnx=True)
prompt = 'translate English to French: I was a victim of a series of accidents.'
output_text, output_logits = generative_t5(prompt, max_length=100, temperature=0.)
Results in no output: it doesn't raise an error or anything, execution just stops partway through generation.
Provide a code snippet to reproduce your errors. In Google Colab, run
!pip install onnxt5
from onnxt5 import GenerativeT5
from onnxt5.api import get_encoder_decoder_tokenizer
decoder_sess, encoder_sess, tokenizer = get_encoder_decoder_tokenizer()
generative_t5 = GenerativeT5(encoder_sess, decoder_sess, tokenizer, onnx=True)
prompt = 'translate English to French: I was a victim of a series of accidents.'
output_text, output_logits = generative_t5(prompt, max_length=100, temperature=0.)
You are getting this behaviour because you've set max_length=100 while input_ids only reaches length 18. That's why the progress bar stops at 18%: generation itself actually completes. If you set max_length=18 instead, the bar will reach 100%. Still, this issue needs to be fixed.
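Assuming the 18% comes from a progress bar that advances one step per decoded token and halts when generation stops early (this is a guess at the library's behaviour, not confirmed onnxt5 internals), the effect can be sketched without onnxt5 at all:

```python
def generate(max_length, eos_at):
    """Simulate greedy decoding that emits EOS after `eos_at` tokens.

    `eos_at` is a stand-in for the point where the real model produces
    its end-of-sequence token; it is not an onnxt5 parameter.
    Returns the number of steps taken and the final progress fraction.
    """
    for step in range(1, max_length + 1):
        progress = step / max_length  # what a per-token progress bar would show
        if step == eos_at:
            # Generation finished early: the loop exits here, so the
            # progress display never advances past eos_at / max_length.
            return step, progress
    return max_length, 1.0

steps, progress = generate(max_length=100, eos_at=18)
print(steps, progress)  # 18 0.18 -> rendered as a bar stuck at 18%
```

Under this reading the output text is complete despite the stalled bar; a fix would be to close or fill the progress bar once the end-of-sequence token is produced.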