tensorflow-onnx
TFJS conversion bug
Describe the bug
When converting a TFJS model I get the following exception:
Traceback (most recent call last):
File "/usr/local/Cellar/[email protected]/3.9.13_1/Frameworks/Python.framework/Versions/3.9/lib/python3.9/runpy.py", line 197, in _run_module_as_main
return _run_code(code, main_globals, None,
File "/usr/local/Cellar/[email protected]/3.9.13_1/Frameworks/Python.framework/Versions/3.9/lib/python3.9/runpy.py", line 87, in _run_code
exec(code, run_globals)
File "/usr/local/lib/python3.9/site-packages/tf2onnx/convert.py", line 710, in <module>
main()
File "/usr/local/lib/python3.9/site-packages/tf2onnx/convert.py", line 273, in main
model_proto, _ = _convert_common(
File "/usr/local/lib/python3.9/site-packages/tf2onnx/convert.py", line 168, in _convert_common
g = process_tf_graph(tf_graph, const_node_values=const_node_values,
File "/usr/local/lib/python3.9/site-packages/tf2onnx/tfonnx.py", line 456, in process_tf_graph
main_g, subgraphs = graphs_from_tfjs(tfjs_path, input_names, output_names, shape_override,
File "/usr/local/lib/python3.9/site-packages/tf2onnx/tfjs_utils.py", line 306, in graphs_from_tfjs
main_g = read_tfjs_graph(topology['node'], weights, None, input_names, output_names, shape_override,
File "/usr/local/lib/python3.9/site-packages/tf2onnx/tfjs_utils.py", line 462, in read_tfjs_graph
out_shapes = get_output_shapes(node_def, inp_dtypes, inp_shapes, inp_consts)
File "/usr/local/lib/python3.9/site-packages/tf2onnx/tfjs_utils.py", line 213, in get_output_shapes
tf.import_graph_def(mini_graph_def, name='')
File "/usr/local/lib/python3.9/site-packages/tensorflow/python/util/deprecation.py", line 561, in new_func
return func(*args, **kwargs)
File "/usr/local/lib/python3.9/site-packages/tensorflow/python/framework/importer.py", line 403, in import_graph_def
return _import_graph_def_internal(
File "/usr/local/lib/python3.9/site-packages/tensorflow/python/framework/importer.py", line 505, in _import_graph_def_internal
raise ValueError(str(e))
ValueError: NodeDef missing attr 'TArgs' from Op<name=_FusedConv2D; signature=input:T, filter:T, args:, host_args:num_host_args*float -> output:T; attr=T:type,allowed=[DT_HALF, DT_FLOAT, DT_DOUBLE, DT_INT8, DT_QINT8]; attr=TArgs:list(type),min=1; attr=num_args:int,min=0; attr=num_host_args:int,default=0,min=0; attr=strides:list(int); attr=padding:string,allowed=["SAME", "VALID", "EXPLICIT"]; attr=explicit_paddings:list(int),default=[]; attr=data_format:string,default="NHWC",allowed=["NHWC", "NCHW", "NCHW_VECT_C"]; attr=filter_format:string,default="HWIO",allowed=["HWIO", "OIHW", "OIHW_VECT_I"]; attr=dilations:list(int),default=[1, 1, 1, 1]; attr=use_cudnn_on_gpu:bool,default=true; attr=fused_ops:list(string),default=[]; attr=epsilon:float,default=0.0001; attr=leakyrelu_alpha:float,default=0.2>; NodeDef: {{node node}}
Urgency
none
System information
- OS Platform and Distribution (e.g., Linux Ubuntu 18.04): macOS 14.0
- TensorFlow Version: 2.11.1
- Python version: 3.9
- ONNX version (if applicable, e.g. 1.11): 1.12.0
- ONNXRuntime version (if applicable, e.g. 1.11): 1.12.0
To Reproduce
Using the attached test.json TFJS model spec:
python3 -m tf2onnx.convert --tfjs ./test.json --output test.onnx
Additional context
It looks like some info is not propagated to the NodeDef: I see TArgs in the Op definition, but the NodeDef is just {{node node}}. I tried different opsets (18, 15, 11) and the result was the same.
Options with --debug --verbose:
2023-08-04 19:40:40,954 - INFO - tf2onnx: inputs: None
2023-08-04 19:40:40,954 - INFO - tf2onnx: outputs: None
2023-08-04 19:40:40,957 - INFO - tf2onnx.tfonnx: Using tensorflow=2.11.1, onnx=1.12.0, tf2onnx=1.14.0/8f8d49
2023-08-04 19:40:40,957 - INFO - tf2onnx.tfonnx: Using opset <onnx, 15>
test.json.zip
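For reference, the failure can be approximated outside of tf2onnx. Per the traceback, tfjs_utils.get_output_shapes builds a small GraphDef around each node and imports it with tf.import_graph_def to infer output shapes; the _FusedConv2D NodeDef coming from the TFJS model carries no TArgs attribute, and the TF 2.11 op registration marks TArgs as required (min=1, no default), so the import is rejected. Below is a minimal, hand-built sketch (not the tf2onnx code; the node and its attributes are chosen purely for illustration) that should raise the same ValueError under tensorflow==2.11, while versions whose _FusedConv2D registration predates TArgs accept it:

import tensorflow as tf
from tensorflow.core.framework import graph_pb2

graph_def = graph_pb2.GraphDef()

# Placeholder inputs for the fused conv (dtype only; shapes are not needed here).
for name in ("input", "filter", "bias"):
    ph = graph_def.node.add()
    ph.name = name
    ph.op = "Placeholder"
    ph.attr["dtype"].type = tf.float32.as_datatype_enum

# A _FusedConv2D node without the 'TArgs' attr, mimicking the TFJS export.
node = graph_def.node.add()
node.name = "fused_conv"
node.op = "_FusedConv2D"
node.input.extend(["input", "filter", "bias"])
node.attr["T"].type = tf.float32.as_datatype_enum
node.attr["num_args"].i = 1
node.attr["strides"].list.i.extend([1, 1, 1, 1])
node.attr["padding"].s = b"SAME"
node.attr["fused_ops"].list.s.extend([b"BiasAdd"])
# Deliberately no node.attr["TArgs"] set here.

with tf.Graph().as_default():
    try:
        tf.import_graph_def(graph_def, name="")
        print("imported OK")
    except ValueError as e:
        # Expected on TF 2.11: NodeDef missing attr 'TArgs' from Op<name=_FusedConv2D; ...>
        print(e)

If that is the mechanism, the TFJS reader would need to supply a default TArgs when it rebuilds the NodeDef; presumably that is the direction for #2118.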
Fixed by falling back to tensorflow==2.9.3
Issue #2118 has been filed to track this, and it's on the plate now.