seabay (2 comments)
Hi @Engine-Treasure, I think the number of layers is not related to bi-direction. For example, if the encoder is a 2-layered Bi-RNN, the hidden state has (2 * 2) = 4 parts, the...
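A minimal sketch of that point (assuming an `nn.GRU` encoder, which is what the tutorial uses): the hidden state returned by a 2-layer bidirectional RNN has `num_layers * num_directions = 4` slices along its first dimension.

```python
import torch
import torch.nn as nn

# 2-layer bidirectional GRU encoder (hyperparameters here are illustrative)
rnn = nn.GRU(input_size=10, hidden_size=20, num_layers=2, bidirectional=True)
inputs = torch.randn(5, 3, 10)        # (seq_len, batch, input_size)
outputs, hidden = rnn(inputs)

print(hidden.shape)                    # torch.Size([4, 3, 20]) -> 2 layers * 2 directions = 4 parts
print(outputs.shape)                   # torch.Size([5, 3, 40]) -> forward/backward outputs concatenated
```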
Hi @Engine-Treasure, based on my experiments the code matches the first layout, the one that alternates within layers. But why does @spro choose the first layer as the context vector for the Decoder?
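One way to check that layout (a sketch, not from the repo itself): PyTorch's hidden state can be viewed as `(num_layers, num_directions, batch, hidden_size)`, so the two directions are grouped within each layer, and a given layer's states can be sliced out directly.

```python
import torch
import torch.nn as nn

num_layers, num_directions, batch, hidden_size = 2, 2, 3, 20
rnn = nn.GRU(10, hidden_size, num_layers=num_layers, bidirectional=True)
_, hidden = rnn(torch.randn(5, batch, 10))

# View as (layers, directions, batch, hidden): directions alternate within layers
per_layer = hidden.view(num_layers, num_directions, batch, hidden_size)
first_layer = per_layer[0]    # same data as hidden[0:2] -> forward/backward of layer 0
last_layer = per_layer[-1]    # same data as hidden[-2:] -> forward/backward of the top layer
print(first_layer.shape, last_layer.shape)   # both torch.Size([2, 3, 20])
```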