adversarial-attacks-pytorch
[FEATURE] Allow parametrization of use of "device"
✨ Short description of the feature [tl;dr]
Currently, the device is automatically inferred from the model parameters, and images/labels are transferred to that device.
It would be useful to be able to disable this device handling so it can be managed by other frameworks (e.g. Lightning).
💬 Detailed motivation and codes
My main motivation is to use torchattacks for adversarial training with Lightning on multiple GPUs. This is currently not possible because of the transfer to self.device in the forward method of the attacks.
See https://github.com/Lightning-AI/lightning/discussions/14782 for an example of a Lightning module for adversarial training (self.atk is meant to be an Attack object).
I think adding this possibility to torchattacks could go like this:
In Attack base class:
```python
def __init__(self, name, model, use_device=True):
    r"""
    Initializes internal attack state.

    Arguments:
        name (str): name of attack.
        model (torch.nn.Module): model to attack.
        use_device (bool): transfer batches to the same device as the model.
            Recommended to set to False if devices are already handled by
            other libraries, e.g. PyTorch Lightning. (Default: True)
    """
    self.attack = name
    self.model = model
    self.use_device = use_device
```
In the attack implementations:
```python
images = images.clone().detach().to(self.device) if self.use_device else images.clone().detach()
```
(and similarly everywhere images or labels are transferred to self.device)
This way default behavior remains the same.
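To make the proposed flag concrete, here is a minimal, self-contained sketch of the branching behavior. Everything below is a stand-in, not the real torchattacks code: `FakeTensor`, `_prepare`, and the `model_device` parameter are hypothetical names used only so the example runs without torch.

```python
class FakeTensor:
    """Stand-in for torch.Tensor that only tracks its device."""
    def __init__(self, device="cpu"):
        self.device = device

    def clone(self):
        return FakeTensor(self.device)

    def detach(self):
        return self

    def to(self, device):
        return FakeTensor(device)


class Attack:
    """Sketch of the base class with the proposed use_device flag."""
    def __init__(self, name, model_device="cuda:0", use_device=True):
        self.attack = name
        self.device = model_device  # normally inferred from model parameters
        self.use_device = use_device

    def _prepare(self, images):
        # Transfer to self.device only when use_device is enabled; otherwise
        # leave placement to the outer framework (e.g. Lightning).
        if self.use_device:
            return images.clone().detach().to(self.device)
        return images.clone().detach()


imgs = FakeTensor("cpu")
print(Attack("PGD", use_device=True)._prepare(imgs).device)   # -> cuda:0
print(Attack("PGD", use_device=False)._prepare(imgs).device)  # -> cpu
```

With `use_device=False`, batches keep whatever placement the caller gave them, which is exactly what a multi-GPU trainer needs.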
How about adding a new function set_device() to change the device? In this case, you can disable .to(self.device) with set_device(None).
Please refer to https://github.com/Harry24k/adversarial-attacks-pytorch/blob/8b35fb59b7447b9adf4ce8b7fa9fc8b2f9d637f4/torchattacks/attack.py#L57.
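For comparison, here is a minimal sketch of the set_device() alternative, again with a stand-in tensor class rather than the real library (`FakeTensor` and `_prepare` are hypothetical; refer to the linked attack.py for the actual base class):

```python
class FakeTensor:
    """Stand-in for torch.Tensor that only tracks its device."""
    def __init__(self, device="cpu"):
        self.device = device

    def to(self, device):
        return FakeTensor(device)


class Attack:
    """Sketch of the base class with a set_device() method."""
    def __init__(self, name, device="cuda:0"):
        self.attack = name
        self.device = device  # normally inferred from model parameters

    def set_device(self, device):
        # Passing None disables the .to(self.device) transfer entirely.
        self.device = device

    def _prepare(self, images):
        if self.device is not None:
            return images.to(self.device)
        return images  # device handling left to the caller


atk = Attack("PGD")
print(atk._prepare(FakeTensor("cpu")).device)  # -> cuda:0
atk.set_device(None)
print(atk._prepare(FakeTensor("cpu")).device)  # -> cpu (left untouched)
```

Compared to a constructor flag, this keeps every attack's signature unchanged and lets the device policy be switched after construction.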
Yes, that sounds like a much better approach. I will try it with my use case and report back.
set_device works with multi-GPU in PyTorch Lightning with the "ddp" strategy. There is still a device issue with "dp" (the model does not change device), but "dp" is discouraged anyway (https://pytorch-lightning.readthedocs.io/en/latest/accelerators/gpu_intermediate.html), so the addition of set_device is enough for my case.
Great! Thanks a lot for your response.