
`nan` CROWN bounds when clamping

[Open] cherrywoods opened this issue 6 months ago · 0 comments

Describe the bug

I am trying to bound a clamp operation. Using `torch.clamp` directly produces an error stating that `Cast` is an unsupported operation, so I replaced it with `torch.minimum(torch.maximum(x, mins), maxs)`. This version does not report any unsupported operations, but CROWN yields `nan` bounds everywhere.
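For reference, a minimal sketch of the direct `torch.clamp` variant that reportedly triggers the unsupported `Cast` error (the exact original code is not shown in the issue, so this is an assumed reconstruction):

```python
import torch
from torch import nn

class ClampDirect(nn.Module):
    def __init__(self):
        super().__init__()
        self.register_buffer("x", 0.5 * torch.ones(1, 4))
        self.register_buffer("y", 0.75 * torch.ones(1, 4))

    def forward(self, z):
        # Reportedly fails during BoundedModule tracing with
        # "Cast is an unsupported operation".
        return torch.clamp(z, self.x, self.y)
```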

To Reproduce

```python
import torch
from torch import nn
from auto_LiRPA import BoundedModule, BoundedTensor, PerturbationLpNorm

class Test(nn.Module):
    def __init__(self):
        super().__init__()
        # Constant clamping bounds, registered as buffers.
        self.register_buffer("x", 0.5 * torch.ones(1, 4))
        self.register_buffer("y", 0.75 * torch.ones(1, 4))

    def forward(self, z):
        # Equivalent to torch.clamp(z, self.x, self.y).
        return torch.minimum(torch.maximum(z, self.x), self.y)

module = BoundedModule(Test(), torch.empty(1, 4))
ptb = PerturbationLpNorm(x_L=torch.zeros(1, 4), x_U=torch.ones(1, 4))
t = BoundedTensor(torch.zeros(1, 4), ptb)

bounds = module.compute_bounds(x=(t,), method="ibp")  # IBP produces the correct bounds
print(bounds)
# (tensor([[0.5000, 0.5000, 0.5000, 0.5000]], grad_fn=<MinimumBackward0>),
#  tensor([[0.7500, 0.7500, 0.7500, 0.7500]], grad_fn=<MinimumBackward0>))

bounds = module.compute_bounds(x=(t,), method="CROWN")  # CROWN produces nan
print(bounds)
# (tensor([[nan, nan, nan, nan]], grad_fn=<ViewBackward0>),
#  tensor([[nan, nan, nan, nan]], grad_fn=<ViewBackward0>))
```
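One possible workaround to try, assuming `hi >= lo`: since `max(z, lo) = lo + relu(z - lo)` and `min(w, hi) = hi - relu(hi - w)`, the clamp can be rewritten using only ReLU, whose CROWN relaxation is well supported, avoiding the Min/Max operators entirely. A minimal, untested sketch (`ClampViaReLU` is an illustrative name; whether this avoids the `nan` depends on auto_LiRPA's handling of the resulting ops):

```python
import torch
from torch import nn
from auto_LiRPA import BoundedModule, BoundedTensor, PerturbationLpNorm

class ClampViaReLU(nn.Module):
    # clamp(z, lo, hi) == hi - relu((hi - lo) - relu(z - lo)) for hi >= lo.
    def __init__(self):
        super().__init__()
        self.register_buffer("lo", 0.5 * torch.ones(1, 4))
        self.register_buffer("hi", 0.75 * torch.ones(1, 4))

    def forward(self, z):
        return self.hi - torch.relu((self.hi - self.lo) - torch.relu(z - self.lo))

module = BoundedModule(ClampViaReLU(), torch.empty(1, 4))
ptb = PerturbationLpNorm(x_L=torch.zeros(1, 4), x_U=torch.ones(1, 4))
t = BoundedTensor(torch.zeros(1, 4), ptb)
print(module.compute_bounds(x=(t,), method="CROWN"))
```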

System configuration:

  • OS: Ubuntu 22.04.3 LTS
  • Python version: 3.10
  • PyTorch version: 1.12.1
  • Hardware: CPU only (also verified on CUDA: GeForce GT 1030)
  • Have you tried to reproduce the problem in a cleanly created conda/virtualenv environment using official installation instructions and the latest code on the main branch? Yes

cherrywoods · Dec 13 '23 17:12