
About num_steps

Open · cjdjr opened this issue 5 years ago · 1 comment

I am confused about the function _get_parallel_step_context() in nets/attention_model.py#L367. What does num_steps mean? Could you please explain it in detail? Thanks! o( ̄▽ ̄)ブ

cjdjr · Dec 03 '20

Hi,

See also #29. The reason there is a steps dimension is that it lets you evaluate the model on a given tour in a single forward pass, which is much more efficient than decoding one step at a time. This could be useful for, e.g., supervised training (teacher forcing) or things like experience replay. This code is a leftover of some early experiments in that direction, which I thought might still be useful to somebody.
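As a rough illustration of the idea (a minimal sketch with hypothetical shapes and variable names, not the repository's actual code): when the tour is already known, the per-step context embeddings for all steps can be gathered in one call along a steps dimension, instead of looping over decoding steps.

```python
import torch

# Hypothetical shapes, for illustration only:
#   embeddings: (batch_size, graph_size, embed_dim) -- node embeddings
#   tour:       (batch_size, num_steps)             -- a given sequence of node indices
batch_size, graph_size, embed_dim = 4, 10, 16
num_steps = graph_size
embeddings = torch.randn(batch_size, graph_size, embed_dim)
tour = torch.stack([torch.randperm(graph_size) for _ in range(batch_size)])

# Sequential variant: one gather per decoding step, as in autoregressive rollout.
per_step = [
    embeddings.gather(1, tour[:, t:t + 1, None].expand(-1, -1, embed_dim))
    for t in range(num_steps)
]
context_sequential = torch.cat(per_step, dim=1)  # (batch_size, num_steps, embed_dim)

# Parallel variant: a single gather over the whole steps dimension,
# which is what a num_steps > 1 context enables (e.g. for teacher forcing).
context_parallel = embeddings.gather(1, tour[..., None].expand(-1, -1, embed_dim))

assert torch.equal(context_sequential, context_parallel)
```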

Wouter

wouterkool · Dec 04 '20