storchastic
Refactor: Use torchdim instead of Storchastic's plating system
Storchastic uses an intricate system for batching over multiple dimensions, but it is buggy and hard for end users to work with. PyTorch 1.12 recently introduced torchdim, which provides first-class dimension objects that can serve the same purpose as plates in Storchastic. Adopting this standard should give us cleaner, faster code that is easier to read, write, and debug. See https://colab.research.google.com/drive/1BsVkddtVMX35aZAvo2GyI-wSFPVBCWuA#scrollTo=8511a637
torchdim implements:
- Implicit batching: Two batch dimensions are joined together, just like in Storchastic.
- Mixed named tensors: Only batch dimensions need to be named; the event dimensions that Storchastic tracks can remain plain positional (numeric) dimensions.
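
The two points above can be sketched with torchdim's `functorch.dim` API. This is a minimal illustration, not Storchastic code: the dimension names (`batch`, `particle`) and the tensors are invented for the example. Tensors that share the same first-class dimension object are aligned automatically (implicit batching), while the trailing event dimension stays positional (mixed naming):

```python
import torch
from functorch.dim import dims

# Two first-class dimension objects, playing the role of Storchastic plates.
batch, particle = dims(2)

loc = torch.randn(4, 3)       # 4 batch entries, 3-dim event
noise = torch.randn(7, 4, 3)  # 7 particles per batch entry

# Bind the leading dims to dimension objects; the trailing event
# dimension (size 3) stays positional/numeric.
loc_b = loc[batch]
noise_b = noise[particle, batch]

# Implicit batching: the shared `batch` dim aligns automatically, and
# `particle` broadcasts, with no manual unsqueeze/expand bookkeeping.
sample = loc_b + noise_b

# Convert back to an ordinary tensor with an explicit dimension order.
out = sample.order(particle, batch)
print(out.shape)  # torch.Size([7, 4, 3])
```

Note that broadcasting is driven by the identity of the dimension objects, not by positions, which is what makes this less error-prone than Storchastic's current plate bookkeeping.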