anomalib
[Task]: custom torch module as feature extractor backbone
What is the motivation for this task?
- I'd like to load a custom torch module as a backbone in the TorchFXFeatureExtractor
Describe the solution you'd like
Can be done with:

import importlib.util
from pathlib import Path

module_path = Path(module_path)
spec = importlib.util.spec_from_file_location(module_path.stem, module_path)
module = importlib.util.module_from_spec(spec)
spec.loader.exec_module(module)
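For completeness, a minimal sketch of how the loaded module could then yield the class; the helper name `load_class_from_file` is illustrative, not anomalib API:

```python
import importlib.util
from pathlib import Path


def load_class_from_file(file_path: str, class_name: str) -> type:
    """Import a .py file that is not on sys.path and return one of its classes."""
    path = Path(file_path)
    # Build a module spec directly from the file location, bypassing sys.path.
    spec = importlib.util.spec_from_file_location(path.stem, path)
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)
    return getattr(module, class_name)
```

The returned class could then be instantiated with the backbone's init args and handed to TorchFXFeatureExtractor.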
Additional context
In terms of API, the backbone arg in TorchFXFeatureExtractor can be overloaded to support it (one can check whether the string contains .py). I did this on my fork.
On the way, a secondary suggestion: BackboneParams seems like an unnecessary increase in complexity; it would be more straightforward to manage everything in TorchFXFeatureExtractor.__init__.
@jpcbertoldo I am trying to understand this. We currently support passing the model class to the feature extractor https://github.com/openvinotoolkit/anomalib/blob/main/tests/pre_merge/models/test_feature_extractor.py#L76. Is what you are trying to achieve different from this?
Yeah backbone params add a bit of complexity but the idea was to encapsulate the required parameters into a container so that the Lightning CLI can parse it. Since the design is still ongoing maybe we can look into avoiding it.
> @jpcbertoldo I am trying to understand this. We currently support passing the model class to the feature extractor https://github.com/openvinotoolkit/anomalib/blob/main/tests/pre_merge/models/test_feature_extractor.py#L76. Is what you are trying to achieve different from this?
Sorry, I should have been more specific. Yes, the example you showed does it, but it requires the user to instantiate the class by hand, so one cannot use a custom class via CLI/config file (which is what I'm doing).
> Yeah backbone params add a bit of complexity but the idea was to encapsulate the required parameters into a container so that the Lightning CLI can parse it. Since the design is still ongoing maybe we can look into avoiding it.
I get the point, but couldn't the CLI just accept the kwargs for TorchFXFeatureExtractor directly?
- So if I understand your use case, this line should be modified to make it more flexible: https://github.com/openvinotoolkit/anomalib/blob/6b5f4c79c15e4a6c47c93b946d84d430028f8674/anomalib/models/components/feature_extractors/torchfx.py#L113 If I change that line to
backbone_model = backbone_class(**backbone.init_args)
then this works
>>> from anomalib.models.components import TorchFXFeatureExtractor
>>> TorchFXFeatureExtractor("tests.helpers.dummy.DummyModel", return_nodes=["fc1"])
TorchFXFeatureExtractor(
(feature_extractor): DummyModel(
(conv1): Conv2d(3, 32, kernel_size=(3, 3), stride=(1, 1))
(conv2): Conv2d(32, 32, kernel_size=(5, 5), stride=(1, 1))
(conv3): Conv2d(32, 1, kernel_size=(7, 7), stride=(1, 1))
(fc1): Linear(in_features=400, out_features=256, bias=True)
)
)
So maybe you can pass your model to the config as a class path, something like this:
model:
backbone:
class_path: tests.helpers.dummy.DummyModel
init_args:
param1:
...
- As for this, how would we define the structure in the config as
model:
some_param:
backbone:
class_path:
init_args:
If we remove BackboneParams and allow kwargs directly, then checking the format might become a bit difficult. Right now this line takes kwargs, tries to parse them into the required format, and throws an error if the format differs.
if isinstance(backbone, dict):
backbone = BackboneParams(**backbone)
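A minimal sketch of how such a container rejects malformed configs; the field names mirror the config keys from this thread, not necessarily the exact anomalib definition:

```python
from dataclasses import dataclass, field


@dataclass
class BackboneParams:
    """Illustrative container mirroring the class_path/init_args config keys."""

    class_path: str
    init_args: dict = field(default_factory=dict)


backbone = {"class_path": "tests.helpers.dummy.DummyModel", "init_args": {}}
if isinstance(backbone, dict):
    # Unpacking a dict with unexpected keys raises TypeError, which is
    # effectively the format check the thread describes.
    backbone = BackboneParams(**backbone)
```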
I am not sure what's a better way to approach this. Feel free to suggest your ideas.
It's that, but there's an extra catch. I actually do
class_path: some/python/module.py::DummyModel
because there is an incompatibility between anomalib and the repo where I get the class from, so I cannot have both installed and have to specify the file path because it is not on the Python path.
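A hedged sketch of how an overloaded backbone string with that `::` separator could be dispatched; the function name `resolve_backbone` and the exact syntax are assumptions for illustration, not anomalib API:

```python
import importlib
import importlib.util
from pathlib import Path


def resolve_backbone(backbone: str) -> type:
    """Resolve either "pkg.module.ClassName" or "path/to/file.py::ClassName"."""
    if ".py" in backbone:
        # File-path form: the module is not importable from sys.path,
        # so load it directly from its location.
        file_path, class_name = backbone.split("::")
        path = Path(file_path)
        spec = importlib.util.spec_from_file_location(path.stem, path)
        module = importlib.util.module_from_spec(spec)
        spec.loader.exec_module(module)
    else:
        # Dotted form: use the regular import machinery.
        module_name, _, class_name = backbone.rpartition(".")
        module = importlib.import_module(module_name)
    return getattr(module, class_name)
```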
> I am not sure what's a better way to approach this. Feel free to suggest your ideas.
Ok.
As this is addressed in v1, I'm closing this. If you still think this is an issue, you could re-open it or create a new one.