
Implement multi-threaded Trainer.

Open sj6077 opened this issue 8 years ago • 2 comments

As discussed offline, a multi-threaded Trainer should be implemented. Each Trainer thread shares the training data for computation, and the threads run until all of the current batch's data has been processed.
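A minimal sketch of the batch-sharing idea described above: worker threads claim items from the shared batch via an atomic counter and run until every item has been processed. The class name, batch representation, and thread count are hypothetical, not part of the actual Trainer API.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

// Hypothetical sketch: threads share one batch through an index counter
// and keep running until all of the current batch's data is computed.
public class SharedBatchSketch {
  public static int processBatch(final double[] batch, final int numThreads)
      throws InterruptedException {
    final AtomicInteger next = new AtomicInteger(0);      // next unclaimed index
    final AtomicInteger processed = new AtomicInteger(0); // items finished so far
    final ExecutorService pool = Executors.newFixedThreadPool(numThreads);
    for (int t = 0; t < numThreads; t++) {
      pool.execute(() -> {
        int i;
        // Each claim is unique, so every item is processed exactly once.
        while ((i = next.getAndIncrement()) < batch.length) {
          // compute the gradient contribution for batch[i] (omitted)
          processed.incrementAndGet();
        }
      });
    }
    pool.shutdown();
    pool.awaitTermination(10, TimeUnit.SECONDS);
    return processed.get();
  }

  public static void main(String[] args) throws InterruptedException {
    System.out.println(processBatch(new double[100], 4)); // prints 100
  }
}
```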

sj6077 avatar Sep 12 '16 09:09 sj6077

When we implement multi-threaded Trainer, we can consider two versions for threads to write their gradient updates:

  1. Synchronized fashion
  2. Hogwild-style (lock-free)

We should build both versions and compare their performance.
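The two update styles above could look roughly like this on a shared weight vector. This is a sketch for illustration only; the class and method names are hypothetical, and the real Trainer's parameter storage may differ.

```java
// Hypothetical sketch contrasting the two gradient-update styles.
public class UpdateStyles {
  private final double[] weights;    // shared model parameters
  private final Object lock = new Object();

  public UpdateStyles(int dim) {
    weights = new double[dim];
  }

  // 1. Synchronized fashion: every write holds a lock, so updates from
  //    different threads never interleave on the same coordinate.
  public void syncUpdate(int idx, double grad, double lr) {
    synchronized (lock) {
      weights[idx] -= lr * grad;
    }
  }

  // 2. Hogwild-style (lock-free): write without any locking. Concurrent
  //    writes may occasionally overwrite each other, but with sparse
  //    gradients the lost updates tend not to hurt convergence much,
  //    while avoiding lock contention.
  public void hogwildUpdate(int idx, double grad, double lr) {
    weights[idx] -= lr * grad;   // intentionally unsynchronized
  }

  public double get(int idx) {
    return weights[idx];
  }
}
```

The performance comparison would then measure throughput (updates/sec) and convergence (loss per epoch) for each style under the same thread count.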

yunseong avatar Oct 04 '16 09:10 yunseong

I'll start by sending a PR that enables multi-threading in MLR, beginning with the consistent (i.e., non-Hogwild) version.

yunseong avatar Jan 09 '17 10:01 yunseong