wangbin
Thanks for your tutorial! I found this code in models.py, line 80:

```
att1 = self.encoder_att(encoder_out)  # (batch_size, num_pixels, attention_dim)
```

and line 203:

```
attention_weighted_encoding, alpha = self.attention(encoder_out[:batch_size_t], h[:batch_size_t])
```

...
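The post is cut off here, but for context, the two quoted lines come from the additive-attention module and from the decoder's per-timestep loop in the Show, Attend, and Tell tutorial. Below is a minimal sketch of that surrounding structure, assuming the usual layer names (`encoder_att`, `decoder_att`, `full_att`); the dimension values in the usage example are illustrative, not taken from the issue.

```
import torch
import torch.nn as nn


class Attention(nn.Module):
    """Additive attention over encoder pixels, in the style of models.py."""

    def __init__(self, encoder_dim, decoder_dim, attention_dim):
        super().__init__()
        self.encoder_att = nn.Linear(encoder_dim, attention_dim)  # project encoder features
        self.decoder_att = nn.Linear(decoder_dim, attention_dim)  # project decoder hidden state
        self.full_att = nn.Linear(attention_dim, 1)               # scalar score per pixel
        self.relu = nn.ReLU()
        self.softmax = nn.Softmax(dim=1)

    def forward(self, encoder_out, decoder_hidden):
        # "line 80": project every pixel's feature vector into attention space
        att1 = self.encoder_att(encoder_out)     # (batch_size, num_pixels, attention_dim)
        att2 = self.decoder_att(decoder_hidden)  # (batch_size, attention_dim)
        att = self.full_att(self.relu(att1 + att2.unsqueeze(1))).squeeze(2)  # (batch_size, num_pixels)
        alpha = self.softmax(att)                # attention weights over pixels
        # weighted sum of pixel features, one context vector per sequence
        attention_weighted_encoding = (encoder_out * alpha.unsqueeze(2)).sum(dim=1)  # (batch_size, encoder_dim)
        return attention_weighted_encoding, alpha


# "line 203" sits inside the decoder's loop over timesteps: at step t only the
# first batch_size_t captions are still being decoded, so only those rows of
# encoder_out and of the hidden state h are passed to attention.
attention = Attention(encoder_dim=2048, decoder_dim=512, attention_dim=512)  # illustrative sizes
encoder_out = torch.randn(4, 196, 2048)  # (batch_size, num_pixels, encoder_dim)
h = torch.randn(4, 512)                  # decoder hidden state
batch_size_t = 3
attention_weighted_encoding, alpha = attention(encoder_out[:batch_size_t], h[:batch_size_t])
print(attention_weighted_encoding.shape, alpha.shape)  # torch.Size([3, 2048]) torch.Size([3, 196])
```

Note that in this structure the encoder projection (`att1`) is recomputed inside every per-timestep attention call, since `self.attention(...)` is invoked once per decoding step.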