pyqtorch
[Refactor] Batch dimension as first dimension in tensors
Problem
Currently, PyQ places the batch dimension last in tensors, whereas PyTorch places it first.
Consequence
This requires reshaping tensors back and forth to convert between the PyQ and PyTorch conventions, adding code whose only purpose is that conversion.
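A minimal sketch of the back-and-forth reshuffling this issue refers to, assuming a hypothetical 2-qubit PyQ state stored with the batch dimension last:

```python
import torch

# Hypothetical PyQ state: a 2-qubit state with the batch dimension last,
# i.e. shape (2, 2, batch_size).
batch_size = 5
state_pyq = torch.rand(2, 2, batch_size, dtype=torch.cdouble)

# To use PyTorch's batch-first ops, the batch axis must be moved to the
# front ...
state_torch = torch.movedim(state_pyq, -1, 0)  # shape (batch_size, 2, 2)

# ... and moved back afterwards to restore the PyQ convention.
state_back = torch.movedim(state_torch, 0, -1)  # shape (2, 2, batch_size)
```

Every such round trip is pure bookkeeping; adopting a single convention would remove it.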
Suggested Solution
We could support both conventions (selected via a keyword argument) or settle on a single one (for instance, making the batch dimension the first rather than the last axis of PyQ tensors).
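The keyword-argument variant could look like the following sketch. The function name `zero_state` and the `batch_first` flag are illustrative assumptions, not existing PyQ API:

```python
import torch

def zero_state(n_qubits: int, batch_size: int,
               batch_first: bool = True) -> torch.Tensor:
    """Hypothetical state constructor; `batch_first` selects the convention."""
    # Build |0...0> for every batch entry, batch-first.
    state = torch.zeros((batch_size, 2**n_qubits), dtype=torch.cdouble)
    state[:, 0] = 1
    state = state.reshape([batch_size] + [2] * n_qubits)
    if not batch_first:
        # Legacy PyQ convention: batch dimension last.
        state = torch.movedim(state, 0, -1)
    return state
```

Defaulting `batch_first=True` would align new code with PyTorch while leaving a migration path for existing PyQ code.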