DifferentiableModelWrapper equivalent for Expectation Over Transformation?
Following what was discussed in #191, I'm trying to implement Expectation Over Transformation in a similar fashion. The original poster mentioned using DifferentiableModelWrapper to do so, but that is obviously no longer available. The only wrapper class that seems to be available currently is a ThresholdingWrapper. I was wondering if anyone could help point me in the right direction for accomplishing this.
Ideally, I would think implementing EOT would involve having a wrapper around a PyTorchModel object that averages the gradients from multiple transformed forward passes of that model and then returns the final, averaged gradient, as per EOT.
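To make the idea concrete, here is a minimal sketch of explicit gradient averaging in plain PyTorch (not Foolbox API). The function name `eot_gradient` and the additive-Gaussian-noise transformation are illustrative assumptions only; a real EOT implementation would sample from whatever transformation distribution the threat model specifies.

```python
import torch
import torch.nn.functional as F


def eot_gradient(model, x, label, n_samples=8, sigma=0.1):
    """Average the input gradient over `n_samples` randomly transformed
    copies of `x` (here: additive Gaussian noise as a stand-in transform)."""
    grads = []
    for _ in range(n_samples):
        # Apply a random transformation and track gradients w.r.t. the input.
        x_t = (x + sigma * torch.randn_like(x)).requires_grad_(True)
        loss = F.cross_entropy(model(x_t), label)
        grads.append(torch.autograd.grad(loss, x_t)[0])
    # The EOT gradient is the mean over the sampled transformations.
    return torch.stack(grads).mean(dim=0)
```
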
This is certainly something we should add back to Foolbox 3 in one way or another. Unfortunately, I probably won't have time to work on this in the near future, but a PR would be very welcome. Ideally, it should be a solution that works for all frameworks, but I would need to think more about it to know what the right approach is.
Hi,
are there any updates on this? By checking the documentation I did not find any implementation of Expectation over Transformation.
Do you have any suggestions on how to implement it?
Right now there is no implementation for this in Foolbox. However, this should be pretty easy to implement. You could start by creating a class that inherits from the framework-specific model classes (e.g., PyTorchModel) and then modify the internal `_model` definition to call the inner model multiple times and average the output over those repetitions.
An even better approach would be to create a new class that takes a PyTorchModel/JAXModel as an argument and averages over multiple inner calls when computing the forward pass.
If you want to try this out, it would be great if you could create a PR for this.
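The second suggestion could be sketched roughly as follows in plain PyTorch. Note this is an illustrative sketch, not Foolbox code: the class name `EOTWrapper` and the additive-noise transformation are assumptions, and the wrapper averages logits inside the forward pass so that gradients taken through it are automatically averaged over the sampled transformations.

```python
import torch


class EOTWrapper(torch.nn.Module):
    """Average a model's logits over `n_samples` random transformations.

    Because the averaging happens inside the forward pass, autograd yields
    the average of the per-sample gradients, which is what EOT needs.
    """

    def __init__(self, model, n_samples=8, sigma=0.1):
        super().__init__()
        self.model = model
        self.n_samples = n_samples
        self.sigma = sigma  # std of additive Gaussian noise (example transform)

    def forward(self, x):
        outputs = [
            self.model(x + self.sigma * torch.randn_like(x))
            for _ in range(self.n_samples)
        ]
        return torch.stack(outputs).mean(dim=0)
```

The wrapped module could then be passed to Foolbox like any other network, e.g. `fmodel = foolbox.PyTorchModel(EOTWrapper(net), bounds=(0, 1))`.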
Hi, and thank you @zimmerrol for your answer.
I opened a pull request #719, where I tried to follow your suggestions. Let me know what you think of it.