
Is init hidden state necessary?

Open littleflow3r opened this issue 5 years ago • 0 comments

Hi,

In your hierarchical_att_model.py, you initialize the hidden states for the GRUs with zeros:

```python
self.word_hidden_state = torch.zeros(2, batch_size, self.word_hidden_size)
self.sent_hidden_state = torch.zeros(2, batch_size, self.sent_hidden_size)
```

According to the torch documentation for GRU, `h_0` defaults to zeros when it is not provided. Is there a reason to initialize it manually?
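For reference, a quick sanity check (a minimal sketch with an arbitrary small bidirectional GRU, not the repo's actual model) shows that passing an explicit zero `h_0` and omitting it produce identical outputs:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# A small bidirectional, single-layer GRU (hypothetical sizes for illustration)
gru = nn.GRU(input_size=4, hidden_size=3, bidirectional=True)

x = torch.randn(5, 2, 4)  # (seq_len, batch, input_size)

# Explicit zero initial hidden state:
# shape (num_layers * num_directions, batch, hidden_size) = (2, 2, 3)
h0 = torch.zeros(2, 2, 3)
out_explicit, _ = gru(x, h0)

# Omitting h_0: PyTorch defaults it to zeros internally
out_default, _ = gru(x)

print(torch.allclose(out_explicit, out_default))  # True
```

So the explicit initialization is redundant unless you want a non-zero (e.g. learnable) initial state.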

Thanks.

littleflow3r · Feb 28 '19