
How to use exported Tacotron2 ONNX model?

Open serg06 opened this issue 3 years ago • 6 comments

I used the Tacotron2 -> ONNX export script: PyTorch/SpeechSynthesis/Tacotron2/exports/export_tacotron2_onnx.py

But it produced 3 separate files:

encoder.onnx
decoder_iter.onnx
postnet.onnx

How do we actually use these models with an onnx runtime?

serg06 avatar Jun 08 '21 18:06 serg06

You can use the ONNX models in a way similar to what is done in the test_inference function. Although that function operates on PyTorch models, the logic is the same: pass the encoder output and the initial decoder inputs into a loop that keeps executing decoder_iter for as long as the not_finished flag is still set for any sequence in the batch, feeding each step's updated states back in. The mel frames accumulated over those decoder_iter iterations are then passed to postnet (see the sketch below).

Check the PyTorch tutorial for how to run inference with ONNX Runtime.
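
Roughly, the loop looks like the sketch below. The tensor names, state sizes, and output ordering are assumptions based on the export script and the default Tacotron2 hyperparameters, so verify them against session.get_inputs()/get_outputs() for your own exported files:

import numpy as np
import onnxruntime

# Sketch of chaining the three exported models with ONNX Runtime.
# Assumes `sequences` and `sequence_lengths` are already NumPy arrays of token ids
# and lengths (e.g. from prepare_input_sequence in the repo's inference script).
encoder = onnxruntime.InferenceSession("encoder.onnx")
decoder_iter = onnxruntime.InferenceSession("decoder_iter.onnx")
postnet = onnxruntime.InferenceSession("postnet.onnx")

# 1) Encoder: token ids -> memory / processed_memory
memory, processed_memory, lens = encoder.run(
    None, {"sequences": sequences, "sequence_lengths": sequence_lengths})

# 2) Zero-initialized decoder states (assumed default hparams: 80 mels, 1024 RNN dims, 512 enc dim)
batch, seq_len = memory.shape[0], memory.shape[1]
state = {
    "decoder_input":         np.zeros((batch, 80),      dtype=np.float32),
    "attention_hidden":      np.zeros((batch, 1024),    dtype=np.float32),
    "attention_cell":        np.zeros((batch, 1024),    dtype=np.float32),
    "decoder_hidden":        np.zeros((batch, 1024),    dtype=np.float32),
    "decoder_cell":          np.zeros((batch, 1024),    dtype=np.float32),
    "attention_weights":     np.zeros((batch, seq_len), dtype=np.float32),
    "attention_weights_cum": np.zeros((batch, seq_len), dtype=np.float32),
    "attention_context":     np.zeros((batch, 512),     dtype=np.float32),
}
mask = np.zeros((batch, seq_len), dtype=bool)  # True marks padded positions (none for a single sentence)

# 3) Run decoder_iter step by step until every sequence has produced a stop gate
mel_frames = []
not_finished = np.ones(batch, dtype=np.int32)
for _ in range(2000):  # hard cap on the number of decoder steps
    feeds = dict(state, memory=memory, processed_memory=processed_memory, mask=mask)
    (mel_out, gate, att_h, att_c, dec_h, dec_c,
     att_w, att_w_cum, att_ctx) = decoder_iter.run(None, feeds)
    mel_frames.append(mel_out)
    state.update(decoder_input=mel_out, attention_hidden=att_h, attention_cell=att_c,
                 decoder_hidden=dec_h, decoder_cell=dec_c, attention_weights=att_w,
                 attention_weights_cum=att_w_cum, attention_context=att_ctx)
    # sigmoid(gate) above 0.5 means that sequence is done
    not_finished = not_finished * (1.0 / (1.0 + np.exp(-gate.reshape(batch))) <= 0.5)
    if not_finished.sum() == 0:
        break

# 4) Stack the mel frames to (batch, n_mels, time) and refine them with the postnet
mel = np.stack(mel_frames, axis=2)
mel_postnet = postnet.run(None, {"mel_outputs": mel})[0]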

ghost avatar Jul 06 '21 09:07 ghost

@GrzegorzKarchNV Thanks, I got it working. Is there a reason it's split into 3 different models?

serg06 avatar Jul 06 '21 13:07 serg06

Can you show me how you got it to work? I'm trying this:


import onnxruntime

# prepare_input_sequence comes from the repo's Tacotron2 inference script;
# to_numpy converts a torch tensor to a NumPy array (tensor.detach().cpu().numpy())
encoder = onnxruntime.InferenceSession("./out/encoder.onnx")

texts = ["Hello World, good day."]
sequences, sequence_lengths = prepare_input_sequence(texts)
encoder_inputs = {encoder.get_inputs()[0].name: to_numpy(sequences),
                  encoder.get_inputs()[1].name: to_numpy(sequence_lengths)}
encoder_outs = encoder.run(None, encoder_inputs)

but I'm getting this error that I can't solve!

onnxruntime.capi.onnxruntime_pybind11_state.InvalidArgument: [ONNXRuntimeError] : 2 : INVALID_ARGUMENT : Non-zero status code returned while running LSTM node. Name:'LSTM_28' Status Message: Invalid value/s in sequence_lens. All values must be > 0 and < seq_length. seq_length=22

NeilFranks avatar Oct 18 '21 05:10 NeilFranks

I have the same error too. Did you manage to fix it?

TruscaPetre avatar Dec 13 '21 10:12 TruscaPetre

I have the same error here. Any updates?

martin3252 avatar Mar 04 '22 08:03 martin3252

The best I managed was feeding it longer inputs until it worked. No idea why that works. I would like to know an actual fix.

NeilFranks avatar Mar 04 '22 08:03 NeilFranks
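
One possible reading of the error above: the LSTM node requires every value in sequence_lens to be strictly less than the padded time dimension (seq_length=22 here), and a single 22-symbol sentence makes the two equal. Under that assumption (not verified against the repo), a workaround sketch is to pad the token sequence with one extra padding column so the stored length stays strictly below seq_length:

import numpy as np

# Assumes `sequences`, `sequence_lengths`, `to_numpy`, and `encoder` from the snippet above.
seq = np.pad(to_numpy(sequences), ((0, 0), (0, 1)))  # one extra zero-padding column
lens = to_numpy(sequence_lengths)                    # real lengths, now strictly < seq_length

encoder_inputs = {encoder.get_inputs()[0].name: seq,
                  encoder.get_inputs()[1].name: lens}
encoder_outs = encoder.run(None, encoder_inputs)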