
Converting SRN model from PaddleOCR to ONNX

Open susanin1970 opened this issue 3 years ago • 38 comments

Hi! Thanks for this great repo :)

I trained an SRN model for text recognition from the PaddleOCR framework on my own dataset and exported it to the inference format.
Then I tried to convert the SRN inference model to ONNX.

I used the following command:

paddle2onnx --model_dir D:\Repositories\PaddleOCR\output\rec\srn\inference_model  --model_filename D:\Repositories\PaddleOCR\output\rec\srn\inference_model\inference.pdmodel --params_filename D:\Repositories\PaddleOCR\output\rec\srn\inference_model\inference.pdiparams --save_file C:\Users\Reutov\Desktop\SRN_iso_containers.onnx --opset_version 11 --enable_onnx_checker True  

Then, when I try to run inference with the ONNX model via onnxruntime, I get the following error:

Traceback (most recent call last):
  File ".\paddle_ocr_onnx_test.py", line 33, in <module>
    outputs = session.run(None, {input_name: number_image})
  File "C:\Users\Reutov\anaconda3\lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 184, in run
    raise ValueError("Model requires {} inputs. Input Feed contains {}".format(num_required_inputs, num_inputs))
ValueError: Model requires 5 inputs. Input Feed contains 1

I checked the ONNX graph of the SRN model using Netron:
[screenshot: ONNX graph in Netron]

It can be seen that the model has 5 inputs, one of which (x) takes the input image tensor; the purpose of the others is still unknown to me.

Please tell me what I should do to run correct SRN model inference via onnxruntime. Thanks in advance for the reply :)

susanin1970 avatar Oct 07 '21 09:10 susanin1970

Have you used Paddle to load the stored model, run inference, and get the correct result? Please provide your model to help with the analysis, thank you.

yeliang2258 avatar Oct 08 '21 03:10 yeliang2258

Yes, I checked the SRN model after converting it to the inference format using Paddle.

I use the predict_rec.py script in the PaddleOCR repository to get predictions:

python .\tools\infer\predict_rec.py --rec_model_dir=D:\Repositories\PaddleOCR\output\rec\srn\inference_model --rec_algorithm=SRN --rec_image_shape="1,64,256" --rec_char_type=en --image_dir=C:\Users\Reutov\Desktop\test_iso_numbers --use_gpu=True

I also made the model inference directly from Python code:


rec_algorithm = 'SRN'
ocr = PaddleOCR(
    rec_model_dir=r"D:\Repositories\PaddleOCR\output\rec\srn\inference_model",
    rec_char_type="en",
    rec_image_shape="1,64,256",
    use_gpu=True,
    det=False,
    cls=False,
    rec_algorithm=rec_algorithm
)

And I also got predictions.
Link to the model: https://drive.google.com/file/d/10dlXJwmmr4dmG8OStbZmRO6Yvc5O8dwb/view?usp=sharing

susanin1970 avatar Oct 08 '21 09:10 susanin1970

Please provide your paddle inference model, thanks.

yeliang2258 avatar Oct 08 '21 11:10 yeliang2258

Use Netron to look at your ONNX model; there are indeed multiple inputs. You can use Netron to inspect your paddle inference model too and check whether it also has multiple inputs.
[screenshots: Netron views of the model inputs]

yeliang2258 avatar Oct 08 '21 12:10 yeliang2258

Link on my SRN inference model: https://drive.google.com/file/d/1nYDrG9zDO6Sjo7oYvN57mmmwy3jNzGL7/view?usp=sharing

susanin1970 avatar Oct 08 '21 12:10 susanin1970

Yes, paddle inference model has multiple inputs:
[screenshots: the five model inputs in Netron]

susanin1970 avatar Oct 08 '21 12:10 susanin1970

> Yes, paddle inference model has multiple inputs: […]

So using onnxruntime for inference should also require multiple inputs.

yeliang2258 avatar Oct 08 '21 12:10 yeliang2258

So, I need to generate 5 inputs. One of these inputs will be the image; can I randomly generate all the others to get predictions? Or would it be better to open a ticket in the PaddleOCR repository with this question?

susanin1970 avatar Oct 08 '21 12:10 susanin1970

You can read the inference scripts provided by PaddleOCR to see the specific inputs, or raise a related issue in the PaddleOCR repository.
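For reference, the four non-image inputs are positional-encoding indices and self-attention masks. Below is a minimal numpy sketch loosely modeled on the srn_other_inputs helper in PaddleOCR's tools/infer/predict_rec.py; the shapes assume the default 1x64x256 input, 8 attention heads, and max_text_length=25, so verify them against your version of the script:

```python
import numpy as np

def srn_other_inputs(image_shape=(1, 64, 256), num_heads=8, max_text_length=25):
    """Build the four auxiliary SRN inputs (position indices + attention masks)."""
    _, img_h, img_w = image_shape
    feature_dim = (img_h // 8) * (img_w // 8)  # number of visual feature positions

    encoder_word_pos = np.arange(feature_dim, dtype="int64").reshape(1, feature_dim, 1)
    gsrm_word_pos = np.arange(max_text_length, dtype="int64").reshape(1, max_text_length, 1)

    ones = np.ones((1, max_text_length, max_text_length))
    # upper/lower triangular masks, tiled over attention heads and scaled to -1e9
    bias1 = np.tile(np.triu(ones, 1).reshape(1, 1, max_text_length, max_text_length),
                    (1, num_heads, 1, 1)) * -1e9
    bias2 = np.tile(np.tril(ones, -1).reshape(1, 1, max_text_length, max_text_length),
                    (1, num_heads, 1, 1)) * -1e9
    return (encoder_word_pos, gsrm_word_pos,
            bias1.astype("float32"), bias2.astype("float32"))

pos_enc, pos_gsrm, bias1, bias2 = srn_other_inputs()
print(pos_enc.shape, pos_gsrm.shape, bias1.shape, bias2.shape)
```

These four arrays, plus the preprocessed image, are what the onnxruntime session's input feed needs to contain.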

yeliang2258 avatar Oct 08 '21 12:10 yeliang2258

Hey @susanin1970 I'm working on a similar problem. I was wondering if you could share the code for inference of the onnx models using ORT.

divya1211 avatar Nov 02 '21 13:11 divya1211

Hey @divya1211, I published a similar ticket in the PaddleOCR repo: https://github.com/PaddlePaddle/PaddleOCR/issues/4267
There I was told where to find the code that generates the inputs for the SRN model.

And I wrote a script for testing SRN in ONNX format using that input-generation code: srn_onnx_paddle_ocr_test.zip

It takes the path to the SRN ONNX model, the path to the dict file (check the format here, paragraph Dictionary), and the path to the txt file of the custom dataset (check the format here, paragraph Custom Dataset)

Hope this helps you :)

susanin1970 avatar Nov 10 '21 17:11 susanin1970

> And I wrote a script for testing SRN in ONNX format using that input-generation code: srn_onnx_paddle_ocr_test.zip

Hello! After converting the inference model to an ONNX file, how can I test it on my data? I used your provided code, but it gives the error below.
[screenshot of the error]

Ehteshamciitwah avatar Nov 24 '21 05:11 Ehteshamciitwah

https://iwenjuan.baidu.com/?code=r8hu2s We have prepared a survey; if you have time, please take part to help make Paddle2ONNX better, thank you!

jiangjiajun avatar Nov 25 '21 06:11 jiangjiajun

> After converting the inference model to an ONNX file, how can I test it on my data? I used your provided code, but it gives an error.

This is probably because you have a slightly different alphabet txt file with fewer than 39 characters. I use the standard en_dict.txt: en_dict.txt

The logic of the decode_output function relies on this en_dict.txt file

susanin1970 avatar Nov 25 '21 12:11 susanin1970

Check your raw output of predictions:

predict_tensor = self.srn_recognizer.run(
    None,
    {
        x_name: x,
        data_0_name: data_0.astype("int64"),
        data_1_name: data_1.astype("int64"),
        data_2_name: data_2.astype("int64"),
        data_3_name: data_3.astype("int64")
    }
)
print(predict_tensor[0])

In my case raw output looks like this:

[[12]
 [10]
 [18]
 [30]
 [ 7]
 [ 9]
 [ 3]
 [ 5]
 [ 6]
 [ 5]
 [ 6]
 [37]
 [37]
 [37]
 [37]
 [37]
 [37]
 [37]
 [37]
 [37]
 [37]
 [37]
 [37]
 [37]
 [37]]

The first 11 elements are the indexes of the recognized symbols in en_dict.txt (lowercase Latin letters in this case).
The decode_output function takes all indexes from this raw list except 37.

I assume that index 37 is padding, because in my task a line can have at most 11 characters, while the max_text_length parameter in the SRN configuration YAML file defaults to 25.

If I output the length of raw outputs:

print(len(predict_tensor[0]))

I'll get the following:

25

This is the default max text length of the SRN model.
When I try to change this value, training doesn't start and I get an error.
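To illustrate, decoding then boils down to dropping the padding index and mapping the remaining indexes through the dictionary. A toy sketch (the 36-character alphabet and the helper below are illustrative only; the real mapping comes from your en_dict.txt and the script's decode_output):

```python
def decode_output(indices, char_list, padding_idx):
    """Map raw index predictions to text, skipping the padding index."""
    return "".join(char_list[i] for i in indices if i != padding_idx)

# Toy alphabet: digits 0-9 then lowercase a-z (your en_dict.txt may differ)
chars = [str(d) for d in range(10)] + [chr(c) for c in range(ord("a"), ord("z") + 1)]
raw = [12, 10, 18, 30, 7, 9, 3, 5, 6, 5, 6] + [37] * 14  # flattened model output
text = decode_output(raw, chars, padding_idx=37)
print(len(text))  # 11 — the trailing 37s are dropped
```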

susanin1970 avatar Nov 25 '21 12:11 susanin1970

> Check your raw output of predictions: […]

Thank you very much for your time. Can I implement the Rosetta algorithm the same way as SRN for recognition?

Ehteshamciitwah avatar Nov 26 '21 01:11 Ehteshamciitwah

> Can I implement the Rosetta algorithm the same way as SRN for recognition?

Judging by the PaddleOCR documentation, you can add your own algorithms. Check this link: https://github.com/PaddlePaddle/PaddleOCR/blob/release/2.3/doc/doc_en/add_new_algorithm_en.md

susanin1970 avatar Nov 28 '21 19:11 susanin1970

And there is already an implementation of Rosetta in PaddleOCR: https://github.com/PaddlePaddle/PaddleOCR/blob/release/2.3/doc/doc_en/algorithm_overview_en.md

Or do you wish to convert Rosetta to ONNX in the same way as SRN?

susanin1970 avatar Nov 28 '21 19:11 susanin1970

I implemented Rosetta and fine-tuned it on my data, but the accuracy of Rosetta is lower than SRN's.

I used your code for the SRN implementation. After changing my dataset format, the SRN ONNX code runs fine.

However, if I convert Rosetta into ONNX format, how can I verify the Rosetta ONNX model?


Ehteshamciitwah avatar Nov 29 '21 00:11 Ehteshamciitwah

> However, if I convert Rosetta into ONNX format, how can I verify the Rosetta ONNX model?

I think you should check the input shape of the Rosetta model (you can use Netron for that: https://github.com/lutzroeder/netron), bring your image with text to this shape (i.e., perform the preprocessing), and run an onnxruntime session

I do these steps in my script
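As a sketch of that preprocessing, assuming an SRN-style 1x64x256 grayscale input normalized to [-1, 1] (the exact shape and normalization are assumptions here and should be taken from your model in Netron and from PaddleOCR's preprocessing code):

```python
import numpy as np

def preprocess(gray, target_hw=(64, 256)):
    """Resize a grayscale uint8 image to the model input shape and normalize
    to [-1, 1], returning an NCHW float32 batch."""
    h, w = target_hw
    src_h, src_w = gray.shape
    # nearest-neighbour resize in pure numpy (in practice cv2.resize would be used)
    rows = np.arange(h) * src_h // h
    cols = np.arange(w) * src_w // w
    resized = gray[rows][:, cols].astype("float32")
    normed = resized / 127.5 - 1.0
    return normed.reshape(1, 1, h, w)

batch = preprocess(np.random.randint(0, 256, (48, 320), dtype=np.uint8))
print(batch.shape)  # (1, 1, 64, 256)
```

The resulting batch is what would be fed as the image input of the onnxruntime session, alongside whatever other inputs the model declares.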

susanin1970 avatar Nov 29 '21 07:11 susanin1970

> Check the input shape of the Rosetta model, preprocess your image to this shape, and run an onnxruntime session

Thank you for your continuous effort. I just want to ask one more thing: can I extract only the SRN module from PaddleOCR and then save the model as a .pth file rather than .pdparams? Thank you

Ehteshamciitwah avatar Nov 30 '21 01:11 Ehteshamciitwah

> Can I extract only the SRN module from PaddleOCR and then save the model as a .pth file rather than .pdparams?

I think there must be converters from .pdparams to .pth, but I'm not sure.

susanin1970 avatar Nov 30 '21 08:11 susanin1970

Hello, thank you for your reply. I trained the SRN module and converted it into the inference and ONNX formats. Now for deployment I need TensorRT support. How can I convert and test the SRN inference / SRN ONNX model with TensorRT? I am looking forward to your kind response.


Ehteshamciitwah avatar Dec 01 '21 05:12 Ehteshamciitwah

> Now for deployment I need TensorRT support. How can I convert and test the SRN inference / SRN ONNX model with TensorRT?

I have not tried to run the SRN inference via TensorRT, but I can try to do this experiment

susanin1970 avatar Dec 01 '21 10:12 susanin1970

I tried to convert the pretrained SRN model with the current Paddle2ONNX version, v1.0.1:

paddle2onnx --model_dir D:\Repositories\PaddleOCR\inference_models\srn_pretrained --model_filename inference.pdmodel --params_filename inference.pdiparams --save_file C:\Users\Reutov\Desktop\srn-pretrained.onnx --opset_version 11 --enable_onnx_checker True

And I got the following error:

[Paddle2ONNX] Start to parse PaddlePaddle model...
[Paddle2ONNX] Model file path: D:\Repositories\PaddleOCR\inference_models\srn_pretrained\inference.pdmodel
[Paddle2ONNX] Paramters file path: D:\Repositories\PaddleOCR\inference_models\srn_pretrained\inference.pdiparams
[Paddle2ONNX] Start to parsing Paddle model...
[ERROR][Paddle2ONNX] [pad3d: pad3d_0.tmp_0] NDHWC format is not supported.
[Paddle2ONNX] Due to the operator: pad3d, this model cannot be exported to ONNX.
[ERROR][Paddle2ONNX] [pad3d: pad3d_1.tmp_0] NDHWC format is not supported.
[Paddle2ONNX] Due to the operator: pad3d, this model cannot be exported to ONNX.
[ERROR] Model exporting failed, you can report this problem to https://github.com/PaddlePaddle/Paddle2ONNX.git.

I used to be able to convert SRN to ONNX without any problems
What could be the problem?

susanin1970 avatar Oct 28 '22 09:10 susanin1970

Hi @susanin1970, I faced the same issue. Did you manage to convert the model to ONNX, or perhaps to any other inference format?

> I used to be able to convert SRN to ONNX without any problems. What could be the problem?

flipson avatar Nov 30 '22 15:11 flipson

> Did you manage to convert the model to ONNX, or perhaps to any other inference format?

Hi! As far as I know, SRN doesn't convert to ONNX correctly with the current Paddle2ONNX v1.0.1.
Some time ago I tried converting SRN with an older version of Paddle2ONNX (e.g. 0.9) and it worked fine

susanin1970 avatar Dec 01 '22 07:12 susanin1970

> Link on my SRN inference model: https://drive.google.com/file/d/1nYDrG9zDO6Sjo7oYvN57mmmwy3jNzGL7/view?usp=sharing

Hi, I want to know which paddle2onnx version you used for the SRN model in Paddle. Please reply!

duong0411 avatar Sep 07 '23 04:09 duong0411

> Hi, I want to know which paddle2onnx version you used for the SRN model in Paddle.

Hi ;)
I used Paddle2ONNX v0.9 to convert SRN to ONNX

susanin1970 avatar Sep 07 '23 09:09 susanin1970

> I used Paddle2ONNX v0.9 to convert SRN to ONNX

I installed v0.9 but I still have a problem

duong0411 avatar Sep 07 '23 09:09 duong0411