PiPPy
Split each layer across multiple GPUs
I tried example_train.py with distributed PyTorch and it works well. However, the example splits the network layer by layer and assigns each layer to a single machine. Is it possible to split a single layer across multiple machines, or must each layer be managed by exactly one machine/GPU?
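To make the question concrete, here is a minimal single-process sketch (NumPy only, with made-up shapes) of the kind of intra-layer split I have in mind: one linear layer's weight matrix is partitioned column-wise, so each shard could in principle live on a different GPU, and concatenating the partial outputs reproduces the full layer's output.

```python
import numpy as np

# Hypothetical sizes, for illustration only.
batch, d_in, d_out, n_shards = 4, 8, 6, 2

rng = np.random.default_rng(0)
x = rng.standard_normal((batch, d_in))
W = rng.standard_normal((d_in, d_out))

# Full-layer output, as if computed on one device.
full = x @ W

# Column-wise split: each shard holds a slice of the weight columns
# and would compute its partial output on its own device.
shards = np.split(W, n_shards, axis=1)
partials = [x @ Wi for Wi in shards]

# Concatenating the partial outputs reconstructs the full result.
combined = np.concatenate(partials, axis=1)
assert np.allclose(full, combined)
```

In a real multi-GPU setting the shards would of course need communication (e.g. an all-gather of the partial outputs), which is exactly what I am asking whether PiPPy supports within a pipeline stage.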