onnxruntime_backend
onnxruntime inference session params setting
Background:
My ONNX model includes Dropout ops that need to run in training mode. However, ONNX Runtime optimizes Dropout ops away by default, so I call `session = ort.InferenceSession(modelPath, disabled_optimizers=["EliminateDropout"])` to avoid that.
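For reference, a minimal sketch of the standalone session setup described above; the model path and the CPU provider are placeholders to adjust for your own deployment:

```python
import onnxruntime as ort

# Placeholder path to the ONNX model containing Dropout nodes
# that should stay in the graph (adjust to your setup).
modelPath = "model_with_dropout.onnx"

# Disable the EliminateDropout graph optimizer so ONNX Runtime
# does not remove the Dropout nodes during graph optimization.
session = ort.InferenceSession(
    modelPath,
    providers=["CPUExecutionProvider"],
    disabled_optimizers=["EliminateDropout"],
)

print([inp.name for inp in session.get_inputs()])
```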
Question:
What should I do to achieve the same behavior when I deploy my model as an inference service with Triton? Thanks a lot!