DeepNLP-models-Pytorch
PyTorch implementations of various Deep NLP models from CS224n (Stanford University)
I have learned a lot from this elegant project. Thanks a lot! Based on the equation in the skip-gram negative-sampling algorithm below, I think the negative-example loss calculated by...
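For reference, the skip-gram negative-sampling objective the issue refers to is, per pair, -log σ(u_o·v_c) - Σ_k log σ(-u_k·v_c), where the negative scores are negated *before* the log-sigmoid. A minimal numeric sketch (plain Python, toy scores; the function name `sgns_loss` is mine, not from the notebook):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def sgns_loss(pos_score, neg_scores):
    """Skip-gram negative-sampling loss for one (center, context) pair.

    pos_score  : dot product of the center and true-context vectors
    neg_scores : dot products of the center and sampled negative vectors
    Note the negatives are negated before the log-sigmoid, which is the
    detail the issue questions.
    """
    pos_loss = -math.log(sigmoid(pos_score))
    neg_loss = -sum(math.log(sigmoid(-s)) for s in neg_scores)
    return pos_loss + neg_loss
```

With both scores at 0 the loss is 2·log 2; confident scores (large positive `pos_score`, large negative `neg_scores`) drive it toward 0.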
I was checking out Question Answering with a Dynamic Memory Network (DMN) on the bAbI dataset from this source: [10.Dynamic-Memory-Network-for-Question-Answering.ipynb](https://nbviewer.jupyter.org/github/DSKSD/DeepNLP-models-Pytorch/blob/master/notebooks/10.Dynamic-Memory-Network-for-Question-Answering.ipynb). I modified it a bit so that I can save...
Hi, thank you for the great repository! However, I found that for QA type classification, the code includes the sub-type in the training/testing data. Needless to say it's a perfect...
Hi, I have a small question about file 08.CNN-for-Text-Classification.ipynb, cell [96], line 4: pretrained.append(model[word2index[key]]). word2index[key] gives key's corpus index, and you then look up its pretrained embedding in GoogleNews-vectors-negative300.bin. But the...
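The concern seems to be that pretrained word-vector models such as gensim's KeyedVectors are indexed by the word string itself, not by a corpus-specific integer index. A hypothetical sketch with a toy dict standing in for the GoogleNews vectors (the variable names mirror the notebook, but the data here is made up):

```python
# Toy stand-in for the GoogleNews vectors: word -> vector.
model = {"cat": [0.1, 0.2], "dog": [0.3, 0.4]}
word2index = {"cat": 0, "dog": 1, "fish": 2}

pretrained = []
for key in word2index:
    if key in model:
        pretrained.append(model[key])    # look up by word, not by corpus index
    else:
        pretrained.append([0.0, 0.0])    # zero (or random) init for OOV words
```

With a real KeyedVectors object, `model[word2index[key]]` would only coincide with `model[key]` if the two vocabularies happened to share the same ordering, which is generally not the case.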
Hi, in file 08.CNN-for-Text-Classification.ipynb, where do you pad the input? Is it in cell [110], line 7: x_p.append(torch.cat([x[i], Variable(LongTensor([word2index['']] * (max_x - x[i].size(1)))).view(1, -1)], 1))? Thanks!
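For readers puzzling over that one-liner: it right-pads each sequence with the pad token's index until it reaches the batch's maximum length. A plain-Python sketch of the same idea on index lists (`pad_batch` and `pad_idx` are my names, not the notebook's):

```python
def pad_batch(batch, pad_idx=0):
    """Right-pad each index sequence to the batch's max length.

    A simplified version of what the notebook's torch.cat call does with
    the empty-string pad token '' in word2index.
    """
    max_len = max(len(seq) for seq in batch)
    return [seq + [pad_idx] * (max_len - len(seq)) for seq in batch]
```

For example, `pad_batch([[1, 2, 3], [4]])` yields `[[1, 2, 3], [4, 0, 0]]`.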
Hello, thank you very much for sharing these great materials. They have been a great help while I study this topic. The reason I am writing this issue is to ask about the duplication of log-softmax and cross-entropy in the 08.CNN example. The CNNClassifier's...
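Background for the question: PyTorch's nn.CrossEntropyLoss combines LogSoftmax and NLLLoss, so a model that already ends in LogSoftmax should be paired with NLLLoss (or drop the final LogSoftmax). Because log-softmax is idempotent, the doubled application is redundant rather than numerically wrong, which a small plain-Python check can show (toy logits; helper names are mine):

```python
import math

def log_softmax(logits):
    # Numerically stable log-softmax over a flat list of scores.
    m = max(logits)
    logsum = m + math.log(sum(math.exp(x - m) for x in logits))
    return [x - logsum for x in logits]

def nll(log_probs, target):
    # Negative log-likelihood of the target class.
    return -log_probs[target]

logits = [2.0, 0.5, -1.0]
log_probs = log_softmax(logits)

# CrossEntropyLoss == LogSoftmax followed by NLLLoss:
ce = nll(log_softmax(logits), 0)

# Applying log_softmax to log-probabilities is the identity
# (their exponentials already sum to 1), so the double application
# in the notebook changes nothing numerically:
doubled = nll(log_softmax(log_probs), 0)
```

So the redundancy is harmless for the loss value, but cleaner is either raw logits + CrossEntropyLoss, or LogSoftmax + NLLLoss.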
I want to save the model for Neural Machine Translation (https://nbviewer.jupyter.org/github/DSKSD/DeepNLP-models-Pytorch/blob/master/notebooks/07.Neural-Machine-Translation-with-Attention.ipynb). Can you help me?
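The standard PyTorch pattern is to checkpoint the state_dicts of the encoder and decoder. A sketch with stand-in Linear modules (the notebook's actual Encoder/Decoder classes would go in their place; the checkpoint filename is arbitrary):

```python
import torch
import torch.nn as nn

# Stand-ins for the notebook's Encoder and Decoder modules.
encoder = nn.Linear(4, 3)
decoder = nn.Linear(3, 4)

# Save both state_dicts in one checkpoint file.
torch.save({"encoder": encoder.state_dict(),
            "decoder": decoder.state_dict()}, "nmt_checkpoint.pt")

# Later: rebuild the modules with the same hyperparameters, then restore.
ckpt = torch.load("nmt_checkpoint.pt")
encoder.load_state_dict(ckpt["encoder"])
decoder.load_state_dict(ckpt["decoder"])
```

Saving state_dicts (rather than pickling whole module objects) keeps the checkpoint robust to code refactors, as long as the module definitions still match.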
Step [54] "data = [[d.split(':')[1][:-1], d.split(':')[0]] for d in data]" seems to include the sub-category in the input sequence. For example, for the data line "DESC:def What is ethology ?",...
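To make the leak concrete: for "DESC:def What is ethology ?", `d.split(':')[1][:-1]` yields "def What is ethology ", so the fine-grained label "def" becomes the first input token. One possible fix (my variable names; the original overwrites `data` in place) is to split off the whole "COARSE:fine" tag before the first space:

```python
data = ["DESC:def What is ethology ?"]

# Drop the entire "DESC:def" tag from the input; keep only the coarse
# label "DESC" as the target.
pairs = [[d.split(" ", 1)[1], d.split(":")[0]] for d in data]
```

This gives `[["What is ethology ?", "DESC"]]`, with no sub-type leaking into the question text.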
Hi SungDong. Thanks for the great posts. I am reading the first two models on skip-gram. Why do you use two embeddings instead of one? The second embedding_u has all...
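For context, the classic word2vec formulation keeps separate "center" and "context" embedding matrices, and the training score for a pair is the dot product between a center vector from one table and a context vector from the other. A toy sketch of that two-matrix design (sizes and names here are illustrative, not the notebook's):

```python
import torch
import torch.nn as nn

vocab_size, dim = 10, 4
embedding_v = nn.Embedding(vocab_size, dim)  # center-word vectors (usually kept after training)
embedding_u = nn.Embedding(vocab_size, dim)  # context-word vectors

center = torch.tensor([1])
context = torch.tensor([3])

# Score for the (center, context) pair: dot product across the two tables.
score = (embedding_v(center) * embedding_u(context)).sum(dim=1)
```

A single shared table would force a word to score highly against itself (v·v is large), whereas two tables let "a word as center" and "a word as context" be represented differently.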
In the 4th Jupyter file, `Word Window Classification and Neural Networks`, I found a problem. Specifically, in the class `WindowClassifier`, you have used `self.softmax = nn.LogSoftmax(dim=1)` in the output layer,...