
[Failing Test]: Onnx inference unit tests are failing.

Open tvalentyn opened this issue 1 year ago • 1 comment

What happened?

Due to test configuration issues, these tests were not part of the tox unit test suite. Enabling them causes the following failures (a sketch of a likely cause follows the list):

FAILED apache_beam/ml/inference/onnx_inference_test.py::OnnxPytorchRunInferencePipelineTest::test_invalid_input_type - AttributeError: 'TestOnnxModelHandler' object has no attribute '_batching_kwargs'
FAILED apache_beam/ml/inference/onnx_inference_test.py::OnnxPytorchRunInferencePipelineTest::test_model_handler_large_model - AssertionError: True is not false
FAILED apache_beam/ml/inference/onnx_inference_test.py::OnnxPytorchRunInferencePipelineTest::test_model_handler_sets_env_vars - AssertionError: True is not false
FAILED apache_beam/ml/inference/onnx_inference_test.py::OnnxPytorchRunInferencePipelineTest::test_pipeline_gcs_model - AttributeError: 'TestOnnxModelHandler' object has no attribute '_batching_kwargs'
FAILED apache_beam/ml/inference/onnx_inference_test.py::OnnxPytorchRunInferencePipelineTest::test_pipeline_local_model_simple - AttributeError: 'TestOnnxModelHandler' object has no attribute '_batching_kwargs'
FAILED apache_beam/ml/inference/onnx_inference_test.py::OnnxTensorflowRunInferencePipelineTest::test_invalid_input_type - AttributeError: 'TestOnnxModelHandler' object has no attribute '_batching_kwargs'
FAILED apache_beam/ml/inference/onnx_inference_test.py::OnnxTensorflowRunInferencePipelineTest::test_pipeline_gcs_model - AttributeError: 'TestOnnxModelHandler' object has no attribute '_batching_kwargs'
FAILED apache_beam/ml/inference/onnx_inference_test.py::OnnxTensorflowRunInferencePipelineTest::test_pipeline_local_model_simple - AttributeError: 'TestOnnxModelHandler' object has no attribute '_batching_kwargs'
FAILED apache_beam/ml/inference/onnx_inference_test.py::OnnxSklearnRunInferencePipelineTest::test_invalid_input_type - AttributeError: 'TestOnnxModelHandler' object has no attribute '_batching_kwargs'
FAILED apache_beam/ml/inference/onnx_inference_test.py::OnnxSklearnRunInferencePipelineTest::test_pipeline_gcs_model - AttributeError: 'TestOnnxModelHandler' object has no attribute '_batching_kwargs'
FAILED apache_beam/ml/inference/onnx_inference_test.py::OnnxSklearnRunInferencePipelineTest::test_pipeline_local_model_simple - AttributeError: 'TestOnnxModelHandler' object has no attribute '_batching_kwargs'
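A plausible cause of the repeated AttributeError is a test-only handler subclass that overrides __init__ without delegating to the base constructor, so attributes the base handler now initializes (such as _batching_kwargs) are never set. The sketch below reproduces that pattern with stand-in classes; the class and attribute names mirror the test output, but the bodies are illustrative assumptions, not the real apache_beam/ml/inference code.

```python
# Self-contained reproduction of the AttributeError pattern seen in the log.
# The class bodies here are illustrative assumptions, not Beam's implementation.


class OnnxModelHandlerNumpy:
    """Stand-in base handler; assume the real one initializes _batching_kwargs."""

    def __init__(self, model_uri, **kwargs):
        self._model_uri = model_uri
        self._batching_kwargs = {}  # read later when batching elements

    def batch_elements_kwargs(self):
        return self._batching_kwargs


class BrokenTestOnnxModelHandler(OnnxModelHandlerNumpy):
    """Overrides __init__ without calling super(), so base attributes are missing."""

    def __init__(self, model_uri):
        self._model_uri = model_uri  # _batching_kwargs is never set


class FixedTestOnnxModelHandler(OnnxModelHandlerNumpy):
    """Delegates to the base constructor, so newly added base attributes stay set."""

    def __init__(self, model_uri, **kwargs):
        super().__init__(model_uri, **kwargs)


if __name__ == '__main__':
    try:
        BrokenTestOnnxModelHandler('gs://bucket/model.onnx').batch_elements_kwargs()
    except AttributeError as e:
        print(f'Reproduced: {e}')  # ... has no attribute '_batching_kwargs'
    print(FixedTestOnnxModelHandler('gs://bucket/model.onnx').batch_elements_kwargs())
```

The "True is not false" assertions in test_model_handler_large_model and test_model_handler_sets_env_vars may similarly reflect test expectations that drifted from newer base-class defaults, though that needs confirmation against the test code itself.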

Issue Failure

Failure: Test is continually failing

Issue Priority

Priority: 1 (unhealthy code / failing or flaky postcommit so we cannot be sure the product is healthy)

Issue Components

  • [X] Component: Python SDK
  • [ ] Component: Java SDK
  • [ ] Component: Go SDK
  • [ ] Component: Typescript SDK
  • [ ] Component: IO connector
  • [ ] Component: Beam YAML
  • [ ] Component: Beam examples
  • [ ] Component: Beam playground
  • [ ] Component: Beam katas
  • [ ] Component: Website
  • [ ] Component: Spark Runner
  • [ ] Component: Flink Runner
  • [ ] Component: Samza Runner
  • [ ] Component: Twister2 Runner
  • [ ] Component: Hazelcast Jet Runner
  • [ ] Component: Google Cloud Dataflow Runner

tvalentyn • May 10 '24 20:05

We should also enable onnx in the dependency compatibility test suite: https://github.com/apache/beam/issues/25796. Beam supports protobuf 3, so we should still be able to test onnx even if onnx doesn't support protobuf 4.

tvalentyn • May 10 '24 21:05
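As a hedged illustration of the protobuf gating mentioned in the comment above: one way to keep the onnx tests meaningful in a compat suite is to skip them when the installed protobuf major version is not 3. This is a sketch only; the skip condition and test class are hypothetical, not existing Beam code, and pinning protobuf<4 in the relevant tox environment would be an alternative.

```python
# Hedged sketch: gate onnx compat tests on the installed protobuf major version.
# The class name and skip message are hypothetical placeholders.
import unittest
from importlib.metadata import PackageNotFoundError, version

try:
    PROTOBUF_MAJOR = int(version('protobuf').split('.')[0])
except PackageNotFoundError:
    PROTOBUF_MAJOR = None


@unittest.skipIf(
    PROTOBUF_MAJOR is None or PROTOBUF_MAJOR >= 4,
    'onnx compat tests are only expected to pass against protobuf 3.x')
class OnnxProtobuf3CompatTest(unittest.TestCase):
    def test_protobuf_major_version(self):
        self.assertEqual(PROTOBUF_MAJOR, 3)


if __name__ == '__main__':
    unittest.main()
```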