recurrentshop
What is the principle of readout and teacher forcing?
Thanks for the great work! It really helps me a lot. But there is still one point that I can't figure out: what is the principle of readout and teacher forcing? How do we feed the output (or the ground truth) of the RNN from the previous time step back into the current time step: by using the output as features together with this step's input, or by using the output as this step's cell state? I have read the code, but it still confuses me. o(╯□╰)o Hoping someone can answer this for me.
see docs/teacher_force.md and docs/readout.md
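In short: the readout is the previous step's output fed back as extra features alongside the current input (it does not replace the hidden or cell state), and teacher forcing swaps that fed-back output for the ground truth during training. A minimal plain-NumPy sketch of the idea (illustrative only, not recurrentshop's implementation; all weight names and dimensions here are made up):

```python
import numpy as np

rng = np.random.default_rng(0)
in_dim, hidden_dim, out_dim, T = 4, 8, 3, 5

W = rng.standard_normal((hidden_dim, in_dim))      # input -> hidden
U = rng.standard_normal((hidden_dim, hidden_dim))  # hidden -> hidden
R = rng.standard_normal((hidden_dim, out_dim))     # readout -> hidden
V = rng.standard_normal((out_dim, hidden_dim))     # hidden -> output

xs = rng.standard_normal((T, in_dim))        # input sequence
ys_true = rng.standard_normal((T, out_dim))  # ground-truth outputs

def run(teacher_force):
    h = np.zeros(hidden_dim)
    readout = np.zeros(out_dim)
    outputs = []
    for t in range(T):
        # Readout: the previous step's output enters as extra features
        # alongside the current input; the hidden state h is untouched.
        h = np.tanh(W @ xs[t] + U @ h + R @ readout)
        y_t = V @ h
        outputs.append(y_t)
        # Teacher forcing: feed back this step's ground truth instead
        # of the model's own (possibly wrong) prediction.
        readout = ys_true[t] if teacher_force else y_t
    return np.stack(outputs)

free_running = run(teacher_force=False)   # inference-style feedback
teacher_forced = run(teacher_force=True)  # training-style feedback
```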
@farizrahman4u In docs/readout.md there is: for cell in cells: lstms_output, h, c = cell([lstms_output, h, c]), which means h and c are passed on to the next layer. But isn't c an internal state? Why would it be passed to another cell? Shouldn't there be two lists containing h and c for each layer, with each cell, as a function, receiving its own state?
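In other words, I would expect per-layer bookkeeping along these lines (a self-contained toy sketch of what I mean; the cell here is a stand-in for illustration, not recurrentshop's actual LSTMCell, and all names are made up):

```python
import numpy as np

def make_toy_cell(dim, rng):
    # Toy recurrent cell for illustration only (not a real LSTM).
    W = rng.standard_normal((dim, 2 * dim))
    def cell(inputs):
        x, h, c = inputs
        z = np.tanh(W @ np.concatenate([x, h]))
        c_new = 0.5 * c + 0.5 * z      # stand-in for the LSTM gate math
        h_new = np.tanh(c_new)
        return h_new, h_new, c_new     # (output, new h, new c)
    return cell

rng = np.random.default_rng(0)
dim = 8
cells = [make_toy_cell(dim, rng) for _ in range(3)]

# One (h, c) pair per layer, so c never crosses layer boundaries.
hs = [np.zeros(dim) for _ in cells]
cs = [np.zeros(dim) for _ in cells]

x = np.zeros(dim)  # input to the bottom layer at this time step
for i, cell in enumerate(cells):
    # Layer i reads and writes only its own states; only the
    # output x flows upward to layer i + 1.
    x, hs[i], cs[i] = cell([x, hs[i], cs[i]])
lstms_output = x
```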