
Split each layer across multiple GPUs

Open · EnricoBeltramo opened this issue 1 year ago · 0 comments

I tried example_train.py with distributed PyTorch and it works well. However, the example splits the network layer by layer and assigns each layer to a single machine. Is it possible to split each individual layer across multiple machines, or must each layer be managed by a single machine/GPU?
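For context, splitting a single layer across devices is usually called tensor (intra-layer) parallelism, as opposed to the pipeline (layer-wise) parallelism the example shows. Below is a minimal, hand-rolled sketch in plain PyTorch of what that splitting means for one `nn.Linear` layer; it assumes two local GPUs (`cuda:0`, `cuda:1`) and does not use any PiPPy API.

```python
# Minimal sketch of intra-layer (tensor) parallelism for a single Linear layer.
# Assumes two local GPUs ("cuda:0", "cuda:1"); plain PyTorch, not a PiPPy API.
# The output columns of the weight matrix are sharded across the devices.
import torch
import torch.nn as nn


class ColumnParallelLinear(nn.Module):
    def __init__(self, in_features, out_features, devices=("cuda:0", "cuda:1")):
        super().__init__()
        assert out_features % len(devices) == 0
        self.devices = devices
        shard_size = out_features // len(devices)
        # Each shard holds a slice of the output features on its own GPU.
        self.shards = nn.ModuleList(
            nn.Linear(in_features, shard_size).to(dev) for dev in devices
        )

    def forward(self, x):
        # Run every shard on its device, then gather the partial outputs
        # back onto the first device and concatenate along the feature dim.
        outs = [shard(x.to(dev)) for shard, dev in zip(self.shards, self.devices)]
        return torch.cat([o.to(self.devices[0]) for o in outs], dim=-1)


if __name__ == "__main__":
    layer = ColumnParallelLinear(16, 8)
    y = layer(torch.randn(4, 16))
    print(y.shape)  # torch.Size([4, 8]), computed across two GPUs
```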

EnricoBeltramo · Jun 04 '23 15:06