Export the model to ONNX format.
Thanks for your great work. I've trained a VITS model; it synthesizes very fluently and inference is very fast. However, is it possible to export the trained model to ONNX format, so that inference with onnxruntime is even faster? Thanks in advance.
I have exported it to ONNX:
onnxexp glance -m vits_baker_v1.onnx
Namespace(model='vits_baker_v1.onnx', subparser_name='glance', verbose=False, version=False)
Exploring on onnx model: vits_baker_v1.onnx
Model summary on: vits_baker_v1.onnx
-------------------------------------------
ir version: 7
opset_import: 11
producer_name: pytorch
doc_string:
all ops used: Gather,Constant,Mul,Transpose,Shape,Cast,Range,Unsqueeze,Less,Conv,Concat,Reshape,Div,MatMul,Sub,Add,ConstantOfShape,Slice,Pad,Equal,Where,Softmax,Pow,ReduceMean,Sqrt,Relu,Split,Erf,RandomNormalLike,Not,Greater,And,Expand,ScatterND,NonZero,GatherND,CumSum,Softplus,ReduceSum,GatherElements,Neg,Exp,Ceil,Clip,ReduceMax,If,Tanh,Sigmoid,LeakyRelu,ConvTranspose
-------------------------------------------
Summary:
Name Shape Input/Output
----------- ------- --------------
tst [1, -1] INPUT
tst_lengths [1] INPUT
output [-1] OUTPUT

@jinfagang How did you succeed? I get an error like this:
Traceback (most recent call last):
File "vits_model.py", line 118, in
@jinfagang my code and model are here: https://github.com/dtx525942103/vits_chinese/issues/3
@dtx525942103 Did you get a None during inference? I did nothing in particular, just the normal export steps, sending a random text as input.
@jinfagang Can you show the exact code you used to export?
any update?
I have tried this, but no luck:

import torch

# dummy inputs matching the training-time forward signature
input_dummy = torch.randint(0, 24, (batch_size, 128)).long().to(device)
input_lengths = torch.randint(100, 129, (batch_size,)).long().to(device)
input_lengths[-1] = 128
spec = torch.rand(batch_size, filter_length // 2 + 1, 30).to(device)
spec_lengths = torch.randint(20, 30, (batch_size,)).long().to(device)
spec_lengths[-1] = spec.size(2)
dummy_input = (input_dummy, input_lengths, spec, spec_lengths)

# export to ONNX
torch.onnx.export(net_g, dummy_input, "net_g.onnx", verbose=True,
                  export_params=True, opset_version=14)
@jinfagang could you share your export code and config?