pytorch_sparse
SparseStorage tensor.Long assert
Hi, thank you for your time developing this project. I have a question that I think is rather simple: why do you enforce that SparseStorage's col and other index parameters be torch.long? Is that really necessary? Thanks
Currently, that is necessary. The reason is that a lot of functionality in PyTorch assumes indices to be of type torch.long, e.g., torch.index_select() only works with torch.long indices. I heard that the PyTorch team is actively working on supporting more index dtypes for these operators in recent versions, so one day this restriction shouldn't be necessary anymore.
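A minimal sketch of the trade-off being discussed, assuming a recent PyTorch install: int32 indices take half the memory of int64 (torch.long), but indexing ops that expect torch.long force a conversion first.

```python
import torch

# Compact int32 indices: 4 bytes per element
idx32 = torch.arange(5, dtype=torch.int32)

# Conversion to torch.long is required by ops that assume 64-bit indices,
# which doubles the per-index memory to 8 bytes
idx64 = idx32.long()

print(idx32.element_size())  # 4
print(idx64.element_size())  # 8

x = torch.randn(5, 3)
# torch.index_select with torch.long indices, as the library assumes
out = torch.index_select(x, 0, idx64)
print(out.shape)  # torch.Size([5, 3])
```

So for large sparse graphs, keeping indices in int32 would roughly halve index storage, which is the memory saving the question is after.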
Thanks. I'm in a situation where being forced to use LongTensors is a real waste of memory (IntTensor would be enough), which is why I asked. Looking forward to the PyTorch team's work you mentioned.
This issue had no activity for 6 months. It will be closed in 2 weeks unless there is some new activity. Is this issue already resolved?