practical-pytorch
Go to https://github.com/pytorch/tutorials - this repo is deprecated and no longer maintained
I was wondering if you could provide any information about where the names data was obtained and what, if any, license it is under.
Dear Robertson: I encountered a paper named 'Efficient Neural Architecture Search via Parameter Sharing' and I want to implement the network in PyTorch. However, after reading the paper, I...
When running the seq2seq-translation-batched cell that creates a sconce Job, it fails after a few seconds. I've noticed that the http://sconce.prontotype.us site is down with the message: "Oh no. We can't load sconce.prontotype.us/ right...
Hi, thanks for this example. I'm fairly new to this, and thought I'd try out the seq2seq-translation-batched example. It includes the statements... > ## Requirements > You will need PyTorch...
Thanks for these tutorials. They are clear and easy to go through. As I am trying to get into building my own models with these tutorials, I was trying to...
torch.__version__ is '0.4.0a0+60a16e5'. Have some changes in PyTorch broken the code in the tutorial? ```5000 5% (0m 7s) nan Tokarchuk / Polish ✗ (Russian) 10000 10% (0m 14s) nan Auttenberg /...
I described the issue here, but got no answer: https://discuss.pytorch.org/t/questions-about-loss-averaging-in-seq2seq-tutorial/13440
Hi, your seq2seq tutorial is really helpful! May I ask why you explicitly set decoder_learning_ratio to 5? Is it often better to have a larger learning rate on the decoder? Thank...
Hi all, I am confused about this: `decoder_hidden = encoder_hidden[:decoder_test.n_layers] # Use last (forward) hidden state from encoder` — should this be `decoder_hidden = encoder_hidden[decoder_test.n_layers:]`? Because now it is the...
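For readers hitting the same question: the answer depends on how PyTorch lays out the final hidden state of a bidirectional GRU. A minimal sketch of that layout (the layer/direction ordering is PyTorch's documented convention; the sizes below are arbitrary, not the tutorial's):

```python
import torch
import torch.nn as nn

n_layers, hidden_size, batch, seq_len = 2, 8, 3, 5
# Bidirectional encoder, as in the tutorial: h_n has shape
# (n_layers * 2, batch, hidden), ordered layer-major:
# [layer0_fwd, layer0_bwd, layer1_fwd, layer1_bwd].
gru = nn.GRU(input_size=4, hidden_size=hidden_size,
             num_layers=n_layers, bidirectional=True)
inp = torch.randn(seq_len, batch, 4)
_, h_n = gru(inp)

# h_n[:n_layers] picks layer 0's forward AND backward states,
# not "the forward state of every layer".
sliced = h_n[:n_layers]
# The forward direction of each layer is recovered with a view instead:
forward_only = h_n.view(n_layers, 2, batch, hidden_size)[:, 0]

print(sliced.shape, forward_only.shape)  # same shape, different states
```

So neither `[:n_layers]` nor `[n_layers:]` selects "the forward hidden state of every layer"; with this layout, the first slice mixes both directions of the lower layers.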
For the seq2seq-translation-batched tutorial: in the `forward` function of `BahdanauAttnDecoderRNN`, there is a log_softmax on the output: `output = F.log_softmax(self.out(torch.cat((output, context), 1)))`. In the **masked_cross_entropy.py** file, there is another log_softmax...
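One observation that may help here (my own reasoning, not a statement from the tutorial author): applying `log_softmax` twice is redundant but not numerically wrong, because `log_softmax` is idempotent — its output already has a per-row logsumexp of 0, so a second application returns the same values. A quick check:

```python
import torch
import torch.nn.functional as F

logits = torch.tensor([[2.0, 1.0, 0.5],
                       [0.0, -1.0, 3.0]])
once = F.log_softmax(logits, dim=1)
twice = F.log_softmax(once, dim=1)

# log_softmax(y) = y - logsumexp(y); since exp(once) sums to 1 per row,
# logsumexp(once) == 0 and the second application leaves `once` unchanged.
print(torch.allclose(once, twice))  # True
```

So the double application wastes a little computation but does not change the loss; still, dropping one of the two calls would be cleaner.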