Question about the loss
https://github.com/sail-sg/AnyDoor/blob/3ba1ec621255b9fbaf3714f4f5769b8280af62fc/anydoor_llava.py#L138 As shown above, you put a minus sign in front of the cross-entropy loss. I don't quite understand what this minus sign means; could you explain it? Part of the goal is for the model to output the label "I want to destroy the whole world together" when the input contains both the trigger and SUDO. To achieve that, shouldn't the cross-entropy loss be minimized?
Thanks for your question!
In this code, we use gradient ascent to optimize the Universal Adversarial Perturbation (UAP), with the goal of making the model output the target label when the trigger is present (the same as in the without-trigger setting). The gradient-ascent update is implemented in line 168 as follows:
```python
momentum = mu * momentum + grad / torch.norm(grad, p=1)
```
Therefore, we put a negative sign on the loss to reverse the optimization direction, so that this update performs gradient ascent.
This ensures the UAP is optimized effectively in both cases (with and without the trigger).
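To see why negating the loss flips the optimization direction, here is a minimal self-contained sketch (a toy quadratic stands in for the cross-entropy loss, and the step size and iteration count are made up for illustration, not taken from the repository). Descending on `-L` takes exactly the same steps as ascending on `L`:

```python
# Toy stand-in for the loss: L(x) = -(x - 3)^2, maximized at x = 3.
def grad_L(x):
    return -2.0 * (x - 3.0)

lr, steps = 0.1, 200  # hypothetical step size and iteration count

# (a) Gradient ascent on L: move *along* the gradient.
x_ascent = 0.0
for _ in range(steps):
    x_ascent = x_ascent + lr * grad_L(x_ascent)

# (b) Gradient descent on the negated loss -L: the gradient of -L is
# -grad_L, so the standard descent update reproduces the same step.
x_descent = 0.0
for _ in range(steps):
    x_descent = x_descent - lr * (-grad_L(x_descent))

print(x_ascent, x_descent)  # both converge to the maximizer, x ≈ 3.0
```

In other words, whether you call it "gradient ascent on the loss" or "gradient descent on the negated loss" is a matter of bookkeeping; the parameter updates are identical.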
So, is the approach of adding a minus sign to the loss and then using gradient ascent equivalent to directly using gradient descent?