
[Task]: Add unit tests for `Engine.export`

Open ashwinvaidya17 opened this issue 1 year ago • 7 comments

What is the motivation for this task?

Currently, export types like NNCF and POT are not covered by tests, which has led to outdated docstrings and broken functionality going unnoticed.

Describe the solution you'd like

The test can be as simple as optimizing a pre-trained Padim model. We already have the trained checkpoint available in a fixture.
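For illustration, a minimal sketch of what such a test might look like. The `ckpt_path` fixture name and the exact `Engine.export` arguments are assumptions here and would need to match the current API and test fixtures:

```python
# Minimal sketch only -- fixture and argument names are assumptions, not the
# actual anomalib test code.
from pathlib import Path

from anomalib.deploy import ExportType
from anomalib.engine import Engine
from anomalib.models import Padim


def test_export_openvino(ckpt_path: Path, tmp_path: Path) -> None:
    """Export a pre-trained Padim model and check that a file is written."""
    engine = Engine()
    model = Padim()
    exported_path = engine.export(
        model=model,
        export_type=ExportType.OPENVINO,
        export_root=tmp_path,
        ckpt_path=str(ckpt_path),  # hypothetical fixture pointing to the trained checkpoint
    )
    assert exported_path is not None and Path(exported_path).exists()
```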

Additional context

No response

ashwinvaidya17 avatar Aug 26 '24 08:08 ashwinvaidya17

Can you provide more context regarding the issue?

mastaan66 avatar Sep 22 '24 07:09 mastaan66

I would like to work on this issue.

haroon0x avatar Nov 14 '24 05:11 haroon0x

@sky0walker99, sure, thanks for your interest. @ashwinvaidya17, can you provide acceptance criteria for @sky0walker99?

samet-akcay avatar Nov 14 '24 10:11 samet-akcay

OpenVINO export also supports post-training optimization and neural network compression, which can be enabled by passing additional parameters to `Engine.export`. While we have tests for `Engine.export` (https://github.com/openvinotoolkit/anomalib/blob/bcc0b439f616b13a8629cb64d8bf0f88fc9083a8/tests/integration/cli/test_cli.py#L159), they do not exercise these two features of the OpenVINO export. Since the API has changed over time, the PTQ and NNCF CLI invocations have diverged from those shown in the docstrings (https://github.com/openvinotoolkit/anomalib/blob/bcc0b439f616b13a8629cb64d8bf0f88fc9083a8/src/anomalib/engine/engine.py#L937). Also, because these code paths have no test coverage, we don't find out when they break.
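For context, this is roughly what a compressed export looks like through the Python API. The parameter and enum names below (`compression_type`, `CompressionType.INT8_PTQ`, the `MVTec` datamodule used for calibration) are assumptions based on the linked docstring and may have drifted, so treat this as a sketch rather than a reference:

```python
# Sketch only: argument and enum names are assumptions based on the linked
# docstring and may not match the current API exactly.
from anomalib.data import MVTec
from anomalib.deploy import CompressionType, ExportType
from anomalib.engine import Engine
from anomalib.models import Padim

engine = Engine()
model = Padim()

# OpenVINO export with INT8 post-training quantization. PTQ needs calibration
# data, so a datamodule is passed alongside the compression type.
engine.export(
    model=model,
    export_type=ExportType.OPENVINO,
    compression_type=CompressionType.INT8_PTQ,
    datamodule=MVTec(),
    ckpt_path="path/to/padim.ckpt",  # placeholder checkpoint path
)
```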

The solution can be as simple as extending the OpenVINO export tests to add the POT and NNCF parameters to the list, so that we test:

  1. The normal OpenVINO export.
  2. All combinations of the compression types listed here: https://github.com/openvinotoolkit/anomalib/blob/bcc0b439f616b13a8629cb64d8bf0f88fc9083a8/src/anomalib/deploy/export.py#L36

There is no need to check accuracy or to train the model from scratch. We can just load the model from the existing checkpoint and export it; the idea is to check whether the model is saved to the file system.
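A rough sketch of how the parametrized test extension could look, assuming a `ckpt_path` fixture with the trained Padim checkpoint and the `CompressionType` enum from the link above (names may need adjusting; the quantization variants would additionally need a datamodule for calibration, which is omitted here for brevity):

```python
# Sketch of the proposed parametrized test; fixture and enum names are
# assumptions and would need to match the actual code.
from pathlib import Path

import pytest

from anomalib.deploy import CompressionType, ExportType
from anomalib.engine import Engine
from anomalib.models import Padim


@pytest.mark.parametrize("compression_type", list(CompressionType))
def test_openvino_export_compression(
    ckpt_path: Path, tmp_path: Path, compression_type: CompressionType
) -> None:
    """Export from the existing checkpoint and check only that a file is saved."""
    engine = Engine()
    model = Padim()
    exported_path = engine.export(
        model=model,
        export_type=ExportType.OPENVINO,
        compression_type=compression_type,
        export_root=tmp_path,
        ckpt_path=str(ckpt_path),
    )
    # No accuracy check -- the goal is just to verify the export artifact exists.
    assert exported_path is not None and Path(exported_path).exists()
```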

ashwinvaidya17 avatar Nov 14 '24 11:11 ashwinvaidya17

I'm interested in working on this issue to add unit tests for `Engine.export`. Based on the description, I understand that:

  - We need to test export types like NNCF and POT.
  - The goal is to verify the model export functionality.
  - We'll use an existing pre-trained Padim model checkpoint.
  - We'll check that models are correctly saved to the file system.

Could you confirm:

  1. Which specific Padim checkpoint should I use?
  2. Are there any specific compression types I should prioritize?
  3. Do you want me to create a separate test file or modify existing tests?

BhagyasriUddandam avatar Mar 25 '25 00:03 BhagyasriUddandam

@BhagyasriUddandam You can work on the issue.

haroon0x avatar Mar 25 '25 07:03 haroon0x

Hi, I’m really interested in working on this task for GSoC and was wondering if I could be assigned to it. I’d love to get started as soon as possible, as the GSoC application deadline is coming up.

rafia-10 avatar Apr 07 '25 14:04 rafia-10

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

github-actions[bot] avatar Jul 07 '25 05:07 github-actions[bot]

This issue was closed because it has been stalled for 14 days with no activity.

github-actions[bot] avatar Jul 21 '25 05:07 github-actions[bot]