EthanObadia
I found this bug using the code below on `main`:

```python3
import torch
import qadence
from qadence.blocks.matrix import MatrixBlock

XMAT = torch.tensor([[0, 1], [1, 0]], dtype=torch.cdouble)
matblock = MatrixBlock(XMAT, (0,))
...
```
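
For context, a minimal sketch of how such a `MatrixBlock` could be exercised, comparing it against qadence's built-in `X` gate; this is not the continuation of the truncated snippet above, and it assumes `qadence.run` accepts a bare block:

```python3
import torch
from qadence import X, run
from qadence.blocks.matrix import MatrixBlock

# Pauli-X as a dense matrix, wrapped in a MatrixBlock on qubit 0.
XMAT = torch.tensor([[0, 1], [1, 0]], dtype=torch.cdouble)
matblock = MatrixBlock(XMAT, (0,))

# States produced by the custom matrix block and by the built-in X gate
# (assumes run() on a bare block starts from |0>); they should match.
state_mat = run(matblock)
state_x = run(X(0))
print(torch.allclose(state_mat, state_x))
```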