Error when decoder has more than 1 layer.
https://github.com/asyml/texar-pytorch/blob/0ba18bff28cd8fff2640021354c15dfd4aef2f72/examples/vae_text/config_lstm_yahoo.py#L62
The output is the following: `RuntimeError: Input batch size 128 doesn't match hidden[0] batch size 256`

The issue is caused by the `initial_state=lstm_states` argument passed when the decoder is forwarded: with more than one layer, `lstm_states` no longer matches the shape the decoder's LSTM expects, so the state's batch dimension is misread (128 × 2 layers = 256).
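For reference, `torch.nn.LSTM` expects each tensor of its `(h, c)` initial state to have shape `(num_layers, batch, hidden_size)`. Below is a minimal sketch of one way to split a flat per-batch state vector into that layout; the names and sizes (`flat_state`, `hidden_size=512`, `num_layers=2`) are illustrative assumptions and not texar-pytorch's actual connector API.

```python
import torch

# Assumed sizes for illustration (not taken from the Yahoo config).
batch_size, hidden_size, num_layers = 128, 512, 2
emb_dim, seq_len = 256, 20

# Flat (batch, num_layers * 2 * hidden_size) tensor, e.g. the output of an
# MLP connector mapping the latent code to the decoder's initial state.
flat_state = torch.randn(batch_size, num_layers * 2 * hidden_size)

# Split into per-layer h and c chunks, then stack to the
# (num_layers, batch, hidden_size) layout torch.nn.LSTM expects.
chunks = flat_state.chunk(num_layers * 2, dim=-1)
h = torch.stack(chunks[:num_layers], dim=0)  # (num_layers, batch, hidden)
c = torch.stack(chunks[num_layers:], dim=0)  # (num_layers, batch, hidden)
lstm_states = (h, c)

# Sanity check: the state's batch dimension matches the input batch size,
# which is exactly what the reported RuntimeError complains about.
assert h.shape == (num_layers, batch_size, hidden_size)

# Feeding this state into a plain multi-layer LSTM works without the error.
lstm = torch.nn.LSTM(input_size=emb_dim, hidden_size=hidden_size,
                     num_layers=num_layers, batch_first=True)
inputs = torch.randn(batch_size, seq_len, emb_dim)
outputs, _ = lstm(inputs, lstm_states)
```

This only demonstrates the state layout a stacked `torch.nn.LSTM` accepts; the actual fix in the example would be to make the connector (or the `initial_state` it produces) aware of the decoder's layer count.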