Pooja Krishnan
**Describe the bug**
The following backend test fails during the pull request test run: https://github.com/onnx/onnx-tensorflow/runs/5982911863?check_suite_focus=true
```
AttributeError: in user code:

  File "/home/runner/work/onnx-tensorflow/onnx-tensorflow/onnx_tf/backend_tf_module.py", line 99, in __call__  *
    output_ops = self.backend._onnx_node_to_tensorflow_op(onnx_node,
  File...
```
The exported TF model doesn't use the output names specified when exporting the ONNX model. The ONNX model's output names are collected as: ```onnx_output_names = [node.name for node in model.graph.output]``` > onnx output...
**Describe the bug**
The model converted from ONNX to TF returns NaN as the result during validation.
**To Reproduce**
1. Export fairseq's transformer LM model to ONNX.
2. Export the onnx...
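As a hypothetical validation helper (not part of the issue), the converted model's outputs can be screened for NaNs before any numerical comparison against the original model; the dummy arrays below stand in for real model outputs:

```python
# Sketch: detect NaNs in a model's output tensors, assuming NumPy arrays
# (or anything np.asarray can coerce).
import numpy as np

def has_nan(outputs):
    """Return True if any output array contains a NaN value."""
    return any(np.isnan(np.asarray(o)).any() for o in outputs)

# Dummy data standing in for converted-model outputs.
clean = [np.zeros((2, 3)), np.ones(4)]
broken = [np.array([1.0, float("nan")])]
print(has_nan(clean))   # False
print(has_nan(broken))  # True
```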
### System Info
Transformers version: 4.24
Torch version: 1.11

Stacktrace:
```
venv/lib/python3.8/site-packages/transformers/models/transfo_xl/modeling_transfo_xl.py:1115: in forward
    softmax_output = self.crit(pred_hid, labels)
venv/lib/python3.8/site-packages/torch/nn/modules/module.py:1190: in _call_impl
    return forward_call(*input, **kwargs)
venv/lib/python3.8/site-packages/torch/nn/modules/module.py:1178: in _slow_forward
    result = self.forward(*input,...
```
We would like to use the DeepSpeed inference engine to draw inference from a Fairseq-based Transformer LM model. Currently we run into an error while loading the DeepSpeed-trained checkpoint. **Describe the...