tensorflow-onnx
Ability to control tf2onnx graph optimizations (for training)
When using tf2onnx to convert a TensorFlow model for training, we would like to disable some of the graph optimizations.
For example, we don't want to fold constants, since constant folding changes the total number of learnable parameters in the model.
Another optimization we want to control is the merge_duplication optimizer: merging duplicate Constant nodes into a single node (so the constant is shared across the model) can likewise change the number of trainable parameters in the ONNX model.
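To make the concern concrete, here is an illustrative sketch (plain Python, not tf2onnx code) of how merging two initializers that happen to hold identical values changes the trainable-parameter count:

```python
# Two independent initializers with identical values. Before merging,
# each is its own trainable parameter tensor; after a
# merge_duplication-style pass, they share one node (and one gradient).
bias_a = [0.1, 0.1, 0.1]  # 3 parameters
bias_b = [0.1, 0.1, 0.1]  # 3 parameters, identical values

params_before = len(bias_a) + len(bias_b)  # counted separately: 6

shared = bias_a            # duplicates merged into a single shared node
params_after = len(shared)  # counted once: 3

print(params_before, params_after)  # 6 3
```

For inference the two graphs are equivalent, but for training the merged graph has fewer independently learnable parameters, which is exactly why we want to opt out of this pass.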
For now, we are using a monkey patch:
import tf2onnx.optimizer

del tf2onnx.optimizer._optimizers["fold_constants"]
del tf2onnx.optimizer._optimizers["merge_duplication"]
But it would be good to be able to control these optimizations through a supported option instead of this monkey patch.
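Until such an option exists, one way to at least contain the monkey patch is to wrap it in a context manager that removes the passes only for the duration of a conversion and then restores the registry. This is a sketch under the same assumption as the patch above (that `tf2onnx.optimizer._optimizers` is a dict-like registry keyed by pass name; it is a private attribute, so this is still a monkey patch). The demo below uses a stand-in dict rather than the real registry:

```python
from contextlib import contextmanager

@contextmanager
def disabled_optimizers(registry, names):
    """Temporarily remove optimizer passes from a dict-like registry.

    `registry` is assumed to look like tf2onnx.optimizer._optimizers;
    `names` are the pass names to disable. The full registry (including
    its ordering, which may matter for pass scheduling) is restored on
    exit, even if the body raises.
    """
    saved = dict(registry)  # full ordered copy for exact restoration
    for name in names:
        registry.pop(name, None)
    try:
        yield
    finally:
        registry.clear()
        registry.update(saved)

# Demo with a stand-in dict instead of the real tf2onnx registry:
registry = {
    "fold_constants": object(),
    "merge_duplication": object(),
    "transpose_optimizer": object(),
}
with disabled_optimizers(registry, ["fold_constants", "merge_duplication"]):
    active = sorted(registry)  # only "transpose_optimizer" is active here
print(active)            # ['transpose_optimizer']
print(sorted(registry))  # all three passes restored after the block
```

In real use you would pass `tf2onnx.optimizer._optimizers` as `registry` and run the conversion inside the `with` block, so other code in the same process still sees the unmodified optimizer set.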