
separate the data parallelization from model parallelization

Open bobye opened this issue 11 years ago • 2 comments

Change backpropagate() into two versions (one sequential in data, one parallel in data).

bobye · Nov 25 '14 02:11
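A minimal sketch of what the two-version split could look like in Scala. `Sample`, `Gradient`, `backpropagateSeq`, and `backpropagatePar` are hypothetical names for illustration, not the library's actual API; the parallel version relies on Scala parallel collections (`.par`, which on Scala 2.13+ requires the scala-parallel-collections module).

```scala
// Hypothetical stand-ins for illustration only.
case class Sample(input: Vector[Double], target: Vector[Double])

case class Gradient(values: Vector[Double]) {
  def +(that: Gradient): Gradient =
    Gradient(values.zip(that.values).map { case (a, b) => a + b })
}

object Gradient {
  def zero(size: Int): Gradient = Gradient(Vector.fill(size)(0.0))
}

// Version 1: sequential in data -- visit samples one at a time.
def backpropagateSeq(samples: Seq[Sample], size: Int)
                    (perSample: Sample => Gradient): Gradient =
  samples.foldLeft(Gradient.zero(size))((acc, s) => acc + perSample(s))

// Version 2: parallel in data -- per-sample gradients are independent,
// so they can be computed concurrently and then summed.
def backpropagatePar(samples: Seq[Sample], size: Int)
                    (perSample: Sample => Gradient): Gradient =
  samples.par.map(perSample).fold(Gradient.zero(size))(_ + _)
```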

Another workaround is to pass the derivatives back as explicit outputs, and use aggregate to obtain the overall gradient.

bobye · Nov 25 '14 02:11
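A rough illustration of that workaround, reusing the hypothetical `Sample` and `Gradient` types from the sketch above: each sample's derivative is returned as a value instead of being accumulated in place, and `aggregate` on a parallel collection combines the partial sums. `overallGradient` and `derivativeOf` are assumed names, not the library's API.

```scala
// Sketch only: derivatives are explicit outputs; `aggregate` folds them into
// one overall gradient without touching any shared mutable state.
def overallGradient(samples: Seq[Sample], size: Int)
                   (derivativeOf: Sample => Gradient): Gradient =
  samples.par.aggregate(Gradient.zero(size))(
    (acc, s) => acc + derivativeOf(s), // seqop: fold one sample into a partial sum
    (g1, g2) => g1 + g2                // combop: merge partial sums across workers
  )
```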

I like the second solution for handling data parallelization: it always keeps the state immutable.

bobye · Nov 25 '14 21:11
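For contrast, a hypothetical mutable alternative shows why the explicit-output approach is attractive under data parallelism: a shared accumulator has to be mutated by every worker, while the `aggregate` version above only combines immutable values.

```scala
// Illustration only: a shared mutable accumulator is race-prone when samples
// are processed in parallel, which is exactly what the immutable approach avoids.
class GradientAccumulator(size: Int) {
  private val buffer = Array.fill(size)(0.0)   // shared mutable state
  def add(g: Gradient): Unit =                 // unsafe under .par without locking
    g.values.indices.foreach(i => buffer(i) += g.values(i))
  def snapshot: Gradient = Gradient(buffer.toVector)
}
```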