optimum
Fix Table Transformer CUDA export
As per title!
Issue: #1774
@mht-sharma I'm not sure about this. Actually, we may be better off sticking with `GraphOptimizationLevel::ORT_ENABLE_ALL` during validation, since that is ORT's default; better to catch issues ahead of time?
But then again, someone may use the ONNX export and then not use ORT.
I agree with your points. Perhaps such a test would be more suitable for ORTModel than export? Given the current error message, users might assume that the model isn't supported, when in fact they simply need to disable the optimizations?
The alternative, obviously, is to add conditional statements to make it pass.