Ensemble-Pytorch
Ensembling Methods incompatible with snnTorch models
Hi,
I have been trying to set up GradientBoosting for an snnTorch model I am working on it's mostly PyTorch in the background. However, I've run into a circular issue that I have yet to find a solution for:
Originally, my inputs for my train/test loaders for my feed forward snnTorch model were all dtype torch.float. I got this error:
33 onehot = torch.zeros(label.size(0), n_classes).float().to(label.device)
---> 34 onehot.scatter_(1, label.view(-1, 1), 1)
36 return onehot
RuntimeError: scatter(): Expected dtype int64 for index
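For context, a minimal sketch of what the one-hot helper in the traceback is doing (the function name and exact shape here are assumptions; the real code lives inside Ensemble-Pytorch). `scatter_()` requires the index tensor to be int64 (`torch.long`), but only the *labels* need that dtype, not the model inputs:

```python
import torch

# Hypothetical reconstruction of the one-hot helper from the traceback.
def onehot_encode(label, n_classes):
    onehot = torch.zeros(label.size(0), n_classes).float().to(label.device)
    # scatter_() insists the index tensor is int64 (torch.long);
    # float labels trigger "Expected dtype int64 for index".
    onehot.scatter_(1, label.view(-1, 1), 1)
    return onehot

# Labels as int64 work fine; the model inputs can stay float.
labels = torch.tensor([0, 2, 1], dtype=torch.long)
print(onehot_encode(labels, 3))
# tensor([[1., 0., 0.],
#         [0., 0., 1.],
#         [0., 1., 0.]])
```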
In an attempt to fix this, I tried changing the dtype of all my inputs to torch.int64, but got this error:
113 def forward(self, input: Tensor) -> Tensor:
--> 114 return F.linear(input, self.weight, self.bias)
RuntimeError: mat1 and mat2 must have the same dtype
For an input to require_grad, the tensor must be a float, so changing the dtype in my Linear layers doesn't help, either.
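The second error can be reproduced in isolation: a default `nn.Linear` holds float32 weights, so feeding it int64 inputs fails the matmul dtype check. A small sketch (the exact error message wording varies across PyTorch versions) showing that the resolution is to keep inputs float and cast only the labels to int64:

```python
import torch
import torch.nn as nn

lin = nn.Linear(4, 2)                 # weights/bias are float32 by default
x_int = torch.ones(3, 4, dtype=torch.int64)

try:
    lin(x_int)                        # int64 inputs vs float32 weights
except RuntimeError as e:
    print(e)                          # dtype-mismatch error, as in the traceback

# The fix: inputs stay float (so autograd works), labels stay int64
# (so scatter_ works) -- rather than forcing everything to one dtype.
x_float = x_int.float()
out = lin(x_float)                    # works: float32 @ float32
print(out.shape)                      # torch.Size([3, 2])
```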
What could be going wrong? Since snnTorch is an extension of PyTorch, I was hoping that Ensemble-Pytorch would also be compatible, but if there is a core compatibility issue I understand. Thanks in advance!
Edit: To clarify, ensemble.fit exposes these issues.
Hi @kgano-ucsd, sorry for the late response. I think the reason is that the size of the one-hot encoded vector mismatches the output dim of your model.
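A quick way to check this hypothesis before calling `ensemble.fit()` is a hypothetical sanity check (the model and sizes below are placeholders): gradient boosting fits the model's outputs against `(batch, n_classes)` one-hot targets, so the final layer's width must equal `n_classes`:

```python
import torch
import torch.nn as nn

# Placeholder model standing in for the snnTorch network.
model = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 10))
n_classes = 10

x = torch.randn(2, 28, 28)            # dummy float batch
out = model(x)
assert out.shape[1] == n_classes, (
    f"model outputs {out.shape[1]} dims but n_classes={n_classes}"
)
print("output dim matches n_classes:", out.shape)
```

If the assertion fires, the model's head (not the input dtype) is what needs adjusting.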