NNEF-Tools
ValueError: Importing a SavedModel with tf.saved_model.load requires a tags= argument if there is more than one MetaGraph. Got tags=None, but there are 0 MetaGraphs in the SavedModel with tag sets: []. Pass a tags= argument to load this SavedModel.
Dear NNEF programmers,
This error occurred while converting a model from ONNX to TF.
Environment
In Google Colab, we installed these packages.
!pip install onnx
!git clone https://github.com/KhronosGroup/NNEF-Tools.git
!cd /content/NNEF-Tools && python setup.py install
!cd /content/NNEF-Tools/parser/python && python setup.py install
!pip install future typing six numpy protobuf torch
!pip install future typing six numpy protobuf onnx onnx-simplifier onnxruntime
!pip install onnx==1.8.1
!pip install torch
!pip install torchvision
!pip install numpy
!pip install onnx
!pip install onnx2keras
Convert
Using the ONNX file, we tried to convert from ONNX to TF.
!python -m nnef_tools.convert --input-format onnx --output-format nnef --input-model /content/pytorch2onnx.onnx --output-model onnx2nnef.nnef --fold-constants
!python -m nnef_tools.convert --input-format nnef --output-format tf --input-model onnx2nnef.nnef --output-model nnef2tf.pb
Load models
After conversion, I got the following error message while loading the TF model (unimportant import statements are skipped).
import tensorflow as tf
loaded = tf.saved_model.load('/content/')
print(list(loaded.signatures.keys()))
---------------------------------------------------------------------------
ValueError Traceback (most recent call last)
<ipython-input-10-0f33b0d08d70> in <module>
3 #loaded_model.summary()
4
----> 5 loaded = tf.saved_model.load("/content/")
6 print(list(loaded.signatures.keys()))
/usr/local/lib/python3.7/dist-packages/tensorflow/python/saved_model/load.py in load(export_dir, tags, options)
934 ValueError: If `tags` don't match a MetaGraph in the SavedModel.
935 """
--> 936 result = load_internal(export_dir, tags, options)["root"]
937 return result
938
/usr/local/lib/python3.7/dist-packages/tensorflow/python/saved_model/load.py in load_internal(export_dir, tags, options, loader_cls, filters)
992 " version) cannot be loaded with node filters.")
993 with ops.init_scope():
--> 994 root = load_v1_in_v2.load(export_dir, tags)
995 root.graph_debug_info = debug_info
996
/usr/local/lib/python3.7/dist-packages/tensorflow/python/saved_model/load_v1_in_v2.py in load(export_dir, tags)
280 metrics.IncrementReadApi(_LOAD_V1_V2_LABEL)
281 loader = _EagerSavedModelLoader(export_dir)
--> 282 result = loader.load(tags=tags)
283 metrics.IncrementRead(write_version="1")
284 return result
/usr/local/lib/python3.7/dist-packages/tensorflow/python/saved_model/load_v1_in_v2.py in load(self, tags)
205 def load(self, tags):
206 """Creates an object from the MetaGraph identified by `tags`."""
--> 207 meta_graph_def = self.get_meta_graph_def_from_tags(tags)
208 load_shared_name_suffix = "_load_{}".format(ops.uid())
209 functions = function_deserialization.load_function_def_library(
/usr/local/lib/python3.7/dist-packages/tensorflow/python/saved_model/load_v1_in_v2.py in get_meta_graph_def_from_tags(self, tags)
78 for mg in self._saved_model.meta_graphs]
79 raise ValueError(
---> 80 "Importing a SavedModel with `tf.saved_model.load` requires a "
81 "`tags=` argument if there is more than one MetaGraph. Got "
82 f"`tags=None`, but there are {len(self._saved_model.meta_graphs)} "
ValueError: Importing a SavedModel with `tf.saved_model.load` requires a `tags=` argument if there is more than one MetaGraph. Got `tags=None`, but there are 0 MetaGraphs in the SavedModel with tag sets: []. Pass a `tags=` argument to load this SavedModel.
How can I solve this problem?
Thank you very much for your time.
Sincerely,
Hi, it seems that the code you are using to read the model expects the 'SavedModel' format as input, which is the newer TensorFlow format, while the NNEF tools work with the older frozen-graph format. The converter can only output this older format, and hence your code is not able to read it. Note that the NNEF converter tools are not really meant to serve as a conversion tool from ONNX to TF, although that happens to be a possible usage path. For that purpose, ONNX has its own tools, and NNEF is not necessary as an intermediate step. May I ask why you want to use NNEF this way instead of converting from ONNX to TF directly?
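For reference, a frozen graph such as the generated nnef2tf.pb has to be read as a GraphDef rather than with tf.saved_model.load. The following is a minimal sketch, assuming the nnef2tf.pb file produced by the conversion step above; the tensor names "input:0" and "output:0" are placeholders and need to be replaced with the names actually present in your graph.
import tensorflow as tf

# Read the frozen graph (the nnef2tf.pb produced above) as a GraphDef.
graph_def = tf.compat.v1.GraphDef()
with tf.io.gfile.GFile("nnef2tf.pb", "rb") as f:
    graph_def.ParseFromString(f.read())

# Inspect the node names to find the real input/output tensors.
print([n.name for n in graph_def.node])

# Wrap the graph in a callable function (TF2 style).
def wrap_frozen_graph(graph_def, inputs, outputs):
    def _imports():
        tf.compat.v1.import_graph_def(graph_def, name="")
    wrapped = tf.compat.v1.wrap_function(_imports, [])
    return wrapped.prune(
        tf.nest.map_structure(wrapped.graph.as_graph_element, inputs),
        tf.nest.map_structure(wrapped.graph.as_graph_element, outputs))

# "input:0" and "output:0" are placeholders; use the names printed above.
model_fn = wrap_frozen_graph(graph_def, inputs="input:0", outputs="output:0")
And if the goal is simply to obtain a TF model from ONNX, a more direct path would be the onnx-tf backend; this sketch assumes the onnx-tf package (pip install onnx-tf), which is not part of NNEF-Tools, and newer versions of it export a SavedModel directory.
import onnx
from onnx_tf.backend import prepare

onnx_model = onnx.load("/content/pytorch2onnx.onnx")
tf_rep = prepare(onnx_model)             # build a TensorFlow representation of the ONNX graph
tf_rep.export_graph("/content/onnx2tf")  # write the result (a SavedModel directory in recent onnx-tf versions)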
@S-hyun any comment on this? Can I close the issue?