
Ensembling Methods incompatible with snnTorch models

Open kgano-ucsd opened this issue 2 years ago • 1 comments

Hi,

I have been trying to set up GradientBoosting for an snnTorch model I am working on; it's mostly PyTorch in the background. However, I've run into a circular issue that I have yet to find a solution for:

Originally, my inputs for my train/test loaders for my feed forward snnTorch model were all dtype torch.float. I got this error:

     33 onehot = torch.zeros(label.size(0), n_classes).float().to(label.device)
---> 34 onehot.scatter_(1, label.view(-1, 1), 1)
     36 return onehot

RuntimeError: scatter(): Expected dtype int64 for index

In an attempt to fix this, I tried changing the dtype of all my inputs to torch.int64, but got this error:

    113 def forward(self, input: Tensor) -> Tensor:
--> 114     return F.linear(input, self.weight, self.bias)

RuntimeError: mat1 and mat2 must have the same dtype

For a tensor to have requires_grad, it must be a floating-point dtype, so changing the dtype in my Linear layers doesn't help, either.

What could be going wrong? Since snnTorch is an extension of PyTorch, I was hoping that Ensemble-Pytorch would also be compatible, but if there is some core compatibility issue, I understand. Thanks in advance!

Edit: To clarify, ensemble.fit exposes these issues.
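(A sketch of a likely resolution, not confirmed in this thread: the scatter_ error concerns the *labels*, not the inputs. The inputs should stay float so gradients can flow through the Linear layers, while only the class labels need to be int64 for the one-hot scatter. The dataset shapes below are hypothetical.)

```python
import torch
from torch.utils.data import TensorDataset, DataLoader

# Inputs stay float (required for autograd through Linear layers);
# only the class labels are cast to int64 for scatter_ / one-hot encoding.
X = torch.randn(8, 10)            # dtype torch.float32
y = torch.randint(0, 3, (8,))     # dtype torch.int64 by default

loader = DataLoader(TensorDataset(X, y.long()), batch_size=4)

# The one-hot helper from the traceback then runs without a dtype error:
n_classes = 3
for data, label in loader:
    onehot = torch.zeros(label.size(0), n_classes).float().to(label.device)
    onehot.scatter_(1, label.view(-1, 1), 1)  # index is int64, as scatter_ expects
```

With this split, F.linear still receives float inputs and weights (avoiding the mat1/mat2 dtype mismatch), while scatter_ receives the int64 index it requires.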

kgano-ucsd avatar Feb 02 '23 07:02 kgano-ucsd

Hi @kgano-ucsd, sorry for the late response. I think the reason is that the size of the one-hot encoded vector mismatches the output dim of your model.

xuyxu avatar Feb 12 '23 04:02 xuyxu
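(For reference, the dimension check xuyxu describes can be sketched as follows; the layer sizes here are hypothetical. For classification, the base model's final layer must emit n_classes logits so the one-hot target of size n_classes aligns with the model output.)

```python
import torch
import torch.nn as nn

n_classes = 3

# Hypothetical base model: the last Linear layer outputs n_classes logits,
# matching the size of the one-hot encoded targets used by the ensemble.
model = nn.Sequential(
    nn.Linear(10, 32),
    nn.ReLU(),
    nn.Linear(32, n_classes),
)

out = model(torch.randn(4, 10))
assert out.shape == (4, n_classes)  # logits align with one-hot targets
```

If the final layer's out_features differed from n_classes, the loss between the model output and the one-hot vector would fail with a shape mismatch.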