
Why does the exported gpt2.onnx have only 1 input?

Open · shp776 opened this issue 4 years ago · 1 comment

Question

I have two questions.

  1. The ONNX file downloaded from https://github.com/onnx/models/tree/master/text/machine_comprehension/gpt-2 appears to have only one input, input_ids, so the position_ids and attention_mask inputs seem to be unused. Is there any problem with performing text generation with this single-input ONNX file? I am asking because the ONNX file used in the notebook linked below has 15 inputs. (A quick way to check a model's inputs is sketched after this list.)

https://github.com/microsoft/onnxruntime/blob/master/onnxruntime/python/tools/transformers/notebooks/Inference_GPT2_with_OnnxRuntime_on_CPU.ipynb

  2. You mentioned the text generation code shown in the attached capture (image).

Using the code you provided, text generation worked properly for a single sentence, but continuous generation of multiple sentences did not. I wonder whether this is because the structure of the ONNX model we used does not handle the past inputs.

Additionally, the code above differs from the original sample.py: it only deals with outputs[0], and the torch.multinomial sampling and top_k steps are missing. Is this because of the ONNX model structure, which takes only one input and does not consider past_state? (A generation loop with top-k sampling is sketched after this list as well.)
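
A simple way to confirm which inputs an exported model actually exposes is to list them with onnxruntime. This is only a minimal sketch; the filename gpt2-lm-head-10.onnx is an assumption and may differ from what you downloaded from the model zoo.

```python
# Sketch: list the graph inputs of an exported GPT-2 ONNX model.
# The filename is an assumption; substitute whatever file you downloaded.
import onnxruntime as ort

session = ort.InferenceSession("gpt2-lm-head-10.onnx")

# The model-zoo export should show a single input (input_ids), while the
# export built in the Microsoft notebook also lists attention_mask,
# position_ids, and the past_* key/value tensors.
for inp in session.get_inputs():
    print(inp.name, inp.shape, inp.type)
```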
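
For the second question, a single-input export can still generate text continuously: since there is no past/present cache, the whole sequence is re-fed at every step and the next token is sampled from the logits of the last position. Below is a hedged sketch, not the notebook's code; the (batch, sequence) input shape, the (batch, sequence, vocab) output layout, the filename, and the top_k value are assumptions that can be checked with the snippet above.

```python
# Sketch of autoregressive generation with a single-input GPT-2 ONNX export.
# Assumptions (not from the original issue): input named "input_ids" with shape
# (batch, sequence); first output = logits with shape (batch, sequence, vocab).
import numpy as np
import onnxruntime as ort
from transformers import GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
session = ort.InferenceSession("gpt2-lm-head-10.onnx")  # assumed filename

def sample_top_k(logits, k=40, temperature=1.0):
    """Sample one token id from the k most likely logits."""
    logits = logits / temperature
    top = np.argpartition(logits, -k)[-k:]           # indices of the k largest logits
    probs = np.exp(logits[top] - logits[top].max())  # softmax over the top-k only
    probs /= probs.sum()
    return int(np.random.choice(top, p=probs))

def generate(prompt, max_new_tokens=40, top_k=40):
    ids = tokenizer.encode(prompt)
    for _ in range(max_new_tokens):
        # No past-state inputs in this export, so the full sequence is re-fed
        # every step; slower than the 15-input model, but functionally equivalent.
        input_ids = np.array([ids], dtype=np.int64)
        logits = session.run(None, {"input_ids": input_ids})[0]
        ids.append(sample_top_k(logits[0, -1], k=top_k))
    return tokenizer.decode(ids)

print(generate("ONNX Runtime makes it easy to"))
```

The past_* inputs in the 15-input export only cache the attention keys/values of the prefix so they are not recomputed each step; they affect speed, not the sampling algorithm itself.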


shp776 commented Aug 10 '21 18:08

Check here for a complete GPT-2 model inference example: https://github.com/microsoft/onnxruntime-extensions/blob/main/tutorials/gpt2bs.py

wenbingl commented Sep 13 '21 17:09