Export the model to onnx format.

mudong0419 opened this issue 3 years ago · 24 comments

Thanks for your great work. I've trained a VITS model; it synthesizes very fluent speech and inference is already fast. However, is it possible to export the trained model to ONNX format, so that inference with onnxruntime is even faster? Thanks in advance.

mudong0419 · Jan 18 '22 12:01

I have exported it to ONNX:

onnxexp glance -m vits_baker_v1.onnx
Namespace(model='vits_baker_v1.onnx', subparser_name='glance', verbose=False, version=False)
Exploring on onnx model: vits_baker_v1.onnx
Model summary on: vits_baker_v1.onnx
-------------------------------------------
ir version: 7
opset_import: 11 
producer_name: pytorch
doc_string: 
all ops used: Gather,Constant,Mul,Transpose,Shape,Cast,Range,Unsqueeze,Less,Conv,Concat,Reshape,Div,MatMul,Sub,Add,ConstantOfShape,Slice,Pad,Equal,Where,Softmax,Pow,ReduceMean,Sqrt,Relu,Split,Erf,RandomNormalLike,Not,Greater,And,Expand,ScatterND,NonZero,GatherND,CumSum,Softplus,ReduceSum,GatherElements,Neg,Exp,Ceil,Clip,ReduceMax,If,Tanh,Sigmoid,LeakyRelu,ConvTranspose
-------------------------------------------

Summary: 
Name         Shape    Input/Output
-----------  -------  --------------
tst          [1, -1]  INPUT
tst_lengths  [1]      INPUT
output       [-1]     OUTPUT


lucasjinreal · Feb 06 '22 06:02
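
Running the exported graph above with onnxruntime would look roughly like the following sketch; the input names (tst, tst_lengths) are taken from the model summary, while the token IDs are placeholders that would normally come from your text/phoneme frontend:

import numpy as np
import onnxruntime as ort

# Load the exported VITS graph.
sess = ort.InferenceSession("vits_baker_v1.onnx")

# Dummy token IDs; shapes [1, -1] and [1] per the summary above.
tokens = np.array([[12, 35, 7, 42, 9, 3]], dtype=np.int64)
lengths = np.array([tokens.shape[1]], dtype=np.int64)

# Single output: a 1-D waveform ([-1] in the summary).
audio = sess.run(None, {"tst": tokens, "tst_lengths": lengths})[0]
print(audio.shape)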

@jinfagang How did you succeed? I get an error like this:

Traceback (most recent call last):
  File "vits_model.py", line 118, in
    example_outputs=(vits_output),)
  File "/usr/local/lib/python3.6/dist-packages/torch/onnx/__init__.py", line 28, in _export
    result = utils._export(*args, **kwargs)
  File "/usr/local/lib/python3.6/dist-packages/torch/onnx/utils.py", line 530, in _export
    fixed_batch_size=fixed_batch_size)
  File "/usr/local/lib/python3.6/dist-packages/torch/onnx/utils.py", line 350, in _model_to_graph
    method_graph, params = torch._C._jit_pass_lower_graph(graph, model._c)
RuntimeError: Unknown type None encountered in graph lowering. This type is not supported in ONNX export.

MaxMax2016 · Feb 07 '22 02:02
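
The "Unknown type None encountered in graph lowering" error usually means a None-valued optional argument or return value is reaching the traced graph. A minimal workaround sketch, assuming the stock SynthesizerTrn.infer signature from the VITS repo (the wrapper name and scale values are illustrative, not confirmed export code from this thread): wrap the inference path so that only tensors cross the export boundary.

import torch

class VitsInferWrapper(torch.nn.Module):
    # Illustrative wrapper: tensor-only inputs and outputs, no None values.
    def __init__(self, net_g):
        super().__init__()
        self.net_g = net_g

    def forward(self, text, text_lengths):
        # infer() returns (audio, attn, mask, ...); keep only the audio tensor
        # so the exported graph has no optional/None outputs.
        audio = self.net_g.infer(
            text, text_lengths,
            noise_scale=0.667, noise_scale_w=0.8, length_scale=1.0)[0]
        return audio.squeeze()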

@jinfagang my code and model are here: https://github.com/dtx525942103/vits_chinese/issues/3

MaxMax2016 · Feb 07 '22 04:02

@dtx525942103 Did you get a None during inference? I didn't do anything special, just the normal export steps, sending random text as input.

lucasjinreal · Feb 07 '22 05:02

@jinfagang Can you show the exact code you used to export?

ZDisket · Feb 20 '22 23:02

any update?

martin3252 · Mar 03 '22 06:03

I have tried this, but no luck

# Dummy training-style inputs for net_g's forward pass
# (batch_size, filter_length, device and net_g are defined earlier in the script).
input_dummy = torch.randint(0, 24, (batch_size, 128)).long().to(device)   # token IDs
input_lengths = torch.randint(100, 129, (batch_size,)).long().to(device)
input_lengths[-1] = 128
spec = torch.rand(batch_size, filter_length // 2 + 1, 30).to(device)      # linear spectrogram
spec_lengths = torch.randint(20, 30, (batch_size,)).long().to(device)
spec_lengths[-1] = spec.size(2)

dummy_input = (input_dummy, input_lengths, spec, spec_lengths)

# ONNX export of the full forward pass
torch.onnx.export(net_g, dummy_input, "net_g.onnx", verbose=True,
                  export_params=True, opset_version=14)

NeonBohdan · May 18 '22 00:05
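
One likely reason the attempt above fails is that net_g's training forward expects the spectrogram inputs and returns several optional values. A sketch of exporting only the inference path instead, reusing a wrapper like the one earlier in the thread (untested here; the token range is a placeholder, and net_g and device are assumed to be defined as in the snippet above):

import torch

# Placeholder text-only inputs for the inference path.
dummy_text = torch.randint(0, 100, (1, 50)).long().to(device)
dummy_lengths = torch.LongTensor([dummy_text.size(1)]).to(device)

wrapper = VitsInferWrapper(net_g).to(device).eval()

torch.onnx.export(
    wrapper, (dummy_text, dummy_lengths), "vits_infer.onnx",
    export_params=True, opset_version=14,
    input_names=["text", "text_lengths"], output_names=["audio"],
    dynamic_axes={"text": {1: "phonemes"}, "audio": {0: "samples"}})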

@jinfagang could you share your export config code?

Aloento · Aug 20 '22 22:08