jinfengr

Results: 4 issues by jinfengr

The ADDA results are indeed impressive. But I am wondering how they compare to: 1) training on MNIST, then fine-tuning on the small USPS dataset; 2) mixing MNIST and the small USPS...
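For concreteness, here is a minimal sketch of baseline 1 (pretrain on MNIST, then fine-tune on the small USPS split) in PyTorch. The LeNet-style network, the epoch budgets, the learning rates, and the 1,800-sample USPS subset are all assumptions for illustration, not the ADDA authors' setup.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F
from torch.utils.data import DataLoader, Subset
from torchvision import datasets, transforms

class LeNet(nn.Module):
    """LeNet-style classifier (assumed architecture, not ADDA's exact net)."""
    def __init__(self):
        super().__init__()
        self.conv1 = nn.Conv2d(1, 20, 5)
        self.conv2 = nn.Conv2d(20, 50, 5)
        self.fc1 = nn.Linear(50 * 4 * 4, 500)
        self.fc2 = nn.Linear(500, 10)

    def forward(self, x):
        x = F.max_pool2d(F.relu(self.conv1(x)), 2)
        x = F.max_pool2d(F.relu(self.conv2(x)), 2)
        return self.fc2(F.relu(self.fc1(x.flatten(1))))

def run_epochs(model, loader, opt, epochs):
    model.train()
    for _ in range(epochs):
        for x, y in loader:
            opt.zero_grad()
            F.cross_entropy(model(x), y).backward()
            opt.step()

# Resize USPS (16x16) to match MNIST (28x28).
tfm = transforms.Compose([transforms.Resize(28), transforms.ToTensor()])
mnist = datasets.MNIST("data", train=True, download=True, transform=tfm)
usps = datasets.USPS("data", train=True, download=True, transform=tfm)
usps_small = Subset(usps, range(1800))  # hypothetical "small USPS" size

model = LeNet()
# Baseline 1, step 1: train on MNIST.
run_epochs(model, DataLoader(mnist, batch_size=128, shuffle=True),
           torch.optim.Adam(model.parameters(), 1e-3), epochs=5)
# Baseline 1, step 2: fine-tune on the small USPS split at a lower lr.
run_epochs(model, DataLoader(usps_small, batch_size=128, shuffle=True),
           torch.optim.Adam(model.parameters(), 1e-4), epochs=10)
```

Baseline 2 would simply be one training run over the concatenation of the two datasets (e.g. `torch.utils.data.ConcatDataset([mnist, usps_small])`).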

The existing implementation doesn't support forward/backward with a batch of trees as input, which makes training and inference slow. This pull request adds batch operations for TreeLSTM and reproduces the *exact*...
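For reference, one common way to batch TreeLSTM computation is to group nodes by depth and run the cell once per level across all trees, so each level costs one set of matrix multiplies instead of one per node. The sketch below illustrates that idea for a Child-Sum cell; the `levels` preprocessing (a list of `(node_idx, child_idxs)` pairs per depth, deepest first) is a hypothetical data structure, and this is not the pull request's actual code.

```python
import torch
import torch.nn as nn

class BatchedChildSumTreeLSTM(nn.Module):
    def __init__(self, in_dim, mem_dim):
        super().__init__()
        self.mem_dim = mem_dim
        self.ioux = nn.Linear(in_dim, 3 * mem_dim)   # W_i, W_o, W_u on x_j
        self.iouh = nn.Linear(mem_dim, 3 * mem_dim)  # U_i, U_o, U_u on sum of child h
        self.fx = nn.Linear(in_dim, mem_dim)         # W_f on x_j
        self.fh = nn.Linear(mem_dim, mem_dim)        # U_f on each child h_k

    def forward(self, levels, inputs):
        """levels: nodes grouped by depth, deepest first, pooled across all
        trees in the batch (hypothetical). inputs: (num_nodes, in_dim)."""
        c = inputs.new_zeros(inputs.size(0), self.mem_dim)
        h = inputs.new_zeros(inputs.size(0), self.mem_dim)
        for level in levels:
            idxs = torch.tensor([i for i, _ in level], device=inputs.device)
            x = inputs[idxs]                                   # (level, in_dim)
            # Sum of child hidden states per node (zeros for leaves).
            h_sum = torch.stack([h[ch].sum(0) for _, ch in level])
            iou = self.ioux(x) + self.iouh(h_sum)              # one matmul per level
            i, o, u = torch.split(iou, self.mem_dim, dim=1)
            i, o, u = torch.sigmoid(i), torch.sigmoid(o), torch.tanh(u)
            fx = self.fx(x)
            # Per-child forget gates, then sum f_k * c_k per node.
            fc = torch.stack([
                (torch.sigmoid(fx[j] + self.fh(h[ch])) * c[ch]).sum(0)
                for j, (_, ch) in enumerate(level)])
            c = c.index_put((idxs,), i * u + fc)               # out-of-place update
            h = h.index_put((idxs,), o * torch.tanh(c[idxs]))
        return c, h
```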

It seems batch size is still not supported by the code? In the forward function of ChildSumTreeLSTM, it seems that it only supports processing a single tree per forward call...
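For context, a Child-Sum TreeLSTM forward along the lines of Tai et al. (2015) typically recurses over one tree's nodes in Python, which is why a single call can only consume one tree. A minimal sketch, assuming a `tree` object with `children`, `idx`, and `state` attributes (not the repo's exact code):

```python
import torch
import torch.nn as nn

class ChildSumTreeLSTM(nn.Module):
    def __init__(self, in_dim, mem_dim):
        super().__init__()
        self.ioux = nn.Linear(in_dim, 3 * mem_dim)   # W_i, W_o, W_u on x_j
        self.iouh = nn.Linear(mem_dim, 3 * mem_dim)  # U_i, U_o, U_u on sum of child h
        self.fx = nn.Linear(in_dim, mem_dim)         # W_f on x_j
        self.fh = nn.Linear(mem_dim, mem_dim)        # U_f on each child h_k

    def node_forward(self, x, child_c, child_h):
        h_sum = child_h.sum(0, keepdim=True)          # sum of child hidden states
        i, o, u = torch.split(self.ioux(x) + self.iouh(h_sum),
                              self.fh.out_features, dim=1)
        i, o, u = torch.sigmoid(i), torch.sigmoid(o), torch.tanh(u)
        f = torch.sigmoid(self.fx(x) + self.fh(child_h))  # one forget gate per child
        c = i * u + (f * child_c).sum(0, keepdim=True)
        h = o * torch.tanh(c)
        return c, h

    def forward(self, tree, inputs):
        # Recurse depth-first: one Python call per node, one tree per forward().
        for child in tree.children:
            self.forward(child, inputs)
        if tree.children:
            child_c = torch.cat([c.state[0] for c in tree.children], 0)
            child_h = torch.cat([c.state[1] for c in tree.children], 0)
        else:
            child_c = inputs.new_zeros(1, self.fh.out_features)
            child_h = inputs.new_zeros(1, self.fh.out_features)
        tree.state = self.node_forward(inputs[tree.idx].unsqueeze(0),
                                       child_c, child_h)
        return tree.state
```

The Python-level recursion over a single tree is the bottleneck the batching PR above is meant to remove.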

In the SoftMaxTree example given in the readme, it is assumed that we already know the target label:

```lua
> input = torch.randn(5,10)
> target = torch.IntTensor{20,24,27,10,12}
> gradOutput = torch.randn(5)
...
```
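The reason the target is required is that a hierarchical softmax only scores the root-to-leaf path of the given label, so no full distribution over classes is ever materialized. A minimal two-level sketch in PyTorch (hypothetical class and sizes; not dpnn's SoftMaxTree):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TwoLevelSoftmax(nn.Module):
    """Toy hierarchical softmax: a root layer picks one of n_groups, and
    each group has its own small output layer over group_size leaves."""
    def __init__(self, dim, n_groups, group_size):
        super().__init__()
        self.group_size = group_size
        self.root = nn.Linear(dim, n_groups)
        self.leaf_w = nn.Parameter(torch.randn(n_groups, group_size, dim) * 0.1)
        self.leaf_b = nn.Parameter(torch.zeros(n_groups, group_size))

    def forward(self, x, target):
        # log P(target|x) = log P(group|x) + log P(leaf|x, group); only the
        # target's group is evaluated, which is why target must be given.
        g = target // self.group_size
        logp_group = F.log_softmax(self.root(x), 1).gather(1, g[:, None])
        logits = torch.bmm(self.leaf_w[g], x.unsqueeze(2)).squeeze(2) + self.leaf_b[g]
        logp_leaf = F.log_softmax(logits, 1).gather(
            1, (target % self.group_size)[:, None])
        return (logp_group + logp_leaf).squeeze(1)

x = torch.randn(5, 10)
target = torch.tensor([20, 24, 27, 10, 12])
print(TwoLevelSoftmax(10, 10, 10)(x, target))  # per-example log-likelihoods
```

At inference time, without a target, one would have to score every leaf (or beam-search over paths) to recover a prediction.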