
Fix Edge Case: Shared Layer Output Between Model Output and Internal Layers

Open cbpark-nota opened this issue 3 months ago • 0 comments

🎯 Summary

This PR addresses an edge case in TensorFlow-to-ONNX conversion: it improves the handling of models in which a layer's output is consumed simultaneously as both the input to another layer and the model's final output.

🔧 Changes

  • Edge case handling: Resolves the case where a layer's output is used as both the model output and the input to an internal layer.

  • Improved Transpose output logic: Branches shared outputs through Identity nodes, distinguishes graph-output consumers from internal consumers, and converts in two steps: edge case handling first, then the regular case.

  • Optimizer reordering: Moves remove_identity to the front of the optimizer chain (see the sketch after this list).

  • Improved error messages: Clearer diagnostics during graph output validation.
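
The optimizer reordering can be pictured with a minimal, self-contained sketch. The pass names and the `OrderedDict` registry below are illustrative stand-ins, not the actual contents of `tf2onnx/optimizer/__init__.py`; the point is only that `remove_identity` is promoted to the front of the chain so the Identity nodes introduced for output branching are cleaned up before later passes run.

```python
from collections import OrderedDict

# Hypothetical stand-ins for optimizer passes; the real classes live under
# tf2onnx/optimizer/ and are registered under different names.
def optimize_transpose(graph): return graph
def fold_constants(graph): return graph
def remove_identity(graph): return graph

_optimizers = OrderedDict([
    ("optimize_transpose", optimize_transpose),
    ("fold_constants", fold_constants),
    ("remove_identity", remove_identity),
])

# Promote remove_identity so it runs first: the Identity nodes inserted for
# output branching are removed before the remaining passes see the graph.
_optimizers.move_to_end("remove_identity", last=False)
assert list(_optimizers)[0] == "remove_identity"
```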

Technical Details

Before: When a layer's output is connected to both the model output and internal layers, the Transpose insertion fails and the graph structure breaks.

After: The shared output is branched through Identity nodes; the Transpose is applied only on the graph-output branch, while internal consumers keep the original, untransposed tensor.
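
As an illustration only (not the code in this PR), the resulting structure can be expressed with the public onnx helper API. The tensor names, shapes, and the NHWC-to-NCHW permutation below are assumptions made for the example: the shared tensor is branched through two Identity nodes, the Transpose is placed only on the graph-output branch, and an internal Relu consumer keeps the original layout.

```python
from onnx import helper, checker, TensorProto

# Shared tensor "conv_out" (hypothetical name) is branched through Identity nodes.
branch_internal = helper.make_node(
    "Identity", ["conv_out"], ["conv_out_internal"], name="branch_internal")
internal_consumer = helper.make_node(
    "Relu", ["conv_out_internal"], ["relu_out"], name="internal_consumer")

# Only the graph-output branch gets the layout Transpose (NHWC -> NCHW).
branch_output = helper.make_node(
    "Identity", ["conv_out"], ["conv_out_branched"], name="branch_output")
output_transpose = helper.make_node(
    "Transpose", ["conv_out_branched"], ["model_output"],
    perm=[0, 3, 1, 2], name="output_transpose")

graph = helper.make_graph(
    [branch_internal, internal_consumer, branch_output, output_transpose],
    "shared_output_branching",
    inputs=[helper.make_tensor_value_info("conv_out", TensorProto.FLOAT, [1, 8, 8, 4])],
    outputs=[
        helper.make_tensor_value_info("model_output", TensorProto.FLOAT, [1, 4, 8, 8]),
        helper.make_tensor_value_info("relu_out", TensorProto.FLOAT, [1, 8, 8, 4]),
    ],
)
checker.check_model(helper.make_model(graph))
```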

🔍 Files Changed

  • tf2onnx/tfonnx.py - Major improvements to the transpose_outputs function.

  • tf2onnx/graph.py - Improved error messages and code cleanup.

  • tf2onnx/optimizer/__init__.py - Changed optimizer order.

Test model file

This is a test model created by extracting the part of the original model that triggers the edge case: https://drive.google.com/file/d/10ChR5OS4k6az1yG13vdbiassrsxYUkuJ/view?usp=sharing

cbpark-nota · Aug 15 '25 10:08