
Tutorials on implementing a few sequence-to-sequence (seq2seq) models with PyTorch and TorchText.

Results: 55 pytorch-seq2seq issues, sorted by recently updated

I used the default Xavier method to init params, and the model converged as the loss dropped from ~4 to...
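For readers wondering what "the default Xavier method" looks like in practice, here is a minimal sketch of Xavier (Glorot) uniform initialization in PyTorch; the module shapes and `init_weights` helper are illustrative, not taken from the tutorial code:

```python
import torch
import torch.nn as nn

def init_weights(m):
    # Xavier (Glorot) uniform init for every Linear weight matrix;
    # biases are zeroed. Bound is sqrt(6 / (fan_in + fan_out)).
    if isinstance(m, nn.Linear):
        nn.init.xavier_uniform_(m.weight)
        if m.bias is not None:
            nn.init.zeros_(m.bias)

model = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 4))
model.apply(init_weights)  # recursively applies init_weights to submodules
```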

I have trained a transformer encoder-decoder model by replacing the encoder with a pre-trained model and putting the decoder-related code (Tutorial 6, Attention Is All You Need) on top of the...

Thank you for providing such a detailed tutorial. I updated the preprocessing pipeline for torchtext 0.13 by replacing Field and BucketIterator with get_tokenizer and DataLoader according to the official torchtext...

I am trying to work on my own data in a .txt file, where the source and target sentences are separated by a tab. The problem is I'm not able to...
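A minimal sketch (independent of torchtext) for reading such a file, assuming one `source<TAB>target` pair per line; the `read_pairs` helper is hypothetical, not part of the tutorials:

```python
def read_pairs(path):
    """Read a parallel corpus with one 'source<TAB>target' pair per line."""
    pairs = []
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.rstrip("\n")
            if not line:
                continue  # skip blank lines
            src, trg = line.split("\t", 1)  # split on the first tab only
            pairs.append((src, trg))
    return pairs
```

The resulting list of `(src, trg)` string pairs can then be tokenized and numericalized before batching.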

Hi Ben! Thanks for sharing this wonderful tutorial! I have a question about the training script of [5. Convolutional Seq2Seq](https://github.com/bentrevett/pytorch-seq2seq/blob/master/5%20-%20Convolutional%20Sequence%20to%20Sequence%20Learning.ipynb). In `train()`, the model accepts `src` and `trg[:,:-1]` to _drop_...

The recent torchtext release (0.12.0) doesn't support Field, BucketIterator, etc. What are the equivalent modules to pre-process datasets like Multi30k, IWSLT2016, IWSLT2017, etc.? Thanks.
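The padding and batching that `BucketIterator` handled can be reproduced with a plain `DataLoader` plus a `collate_fn`. A minimal sketch using only core PyTorch (the padding index and the example token ids are assumptions for illustration):

```python
import torch
from torch.nn.utils.rnn import pad_sequence
from torch.utils.data import DataLoader

PAD_IDX = 0  # assumed padding index

def collate(batch):
    # batch: list of (src_ids, trg_ids) tensor pairs of varying lengths
    src, trg = zip(*batch)
    # pad_sequence defaults to batch_first=False -> [seq len, batch size]
    src = pad_sequence(src, padding_value=PAD_IDX)
    trg = pad_sequence(trg, padding_value=PAD_IDX)
    return src, trg

data = [(torch.tensor([5, 6]), torch.tensor([7, 8, 9])),
        (torch.tensor([4]),    torch.tensor([3, 2]))]
loader = DataLoader(data, batch_size=2, collate_fn=collate)
src, trg = next(iter(loader))
```

Tokenization itself can be done with `torchtext.data.utils.get_tokenizer`, and vocabularies built with `torchtext.vocab.build_vocab_from_iterator`.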

The output shape of the embedding layer is [batch size, src len, emb dim], not embedded = [src len, batch size, emb dim]???
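The discrepancy usually comes down to the input layout: `nn.Embedding` preserves whatever shape it is given and appends the embedding dimension. The tutorials feed `[src len, batch size]` tensors, so a `[batch size, src len]` input naturally yields a batch-first output. A quick check:

```python
import torch
import torch.nn as nn

emb = nn.Embedding(num_embeddings=10, embedding_dim=4)

x = torch.randint(0, 10, (2, 5))   # [batch size, src len]
print(emb(x).shape)                # torch.Size([2, 5, 4])

x_t = x.t()                        # [src len, batch size], as in the tutorials
print(emb(x_t).shape)              # torch.Size([5, 2, 4])
```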

Thank you for this awesome repo you have made public. I had one question: during the training loop, you perform the following step, `output, _ = model(src, trg[:,:-1])`. I...
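For anyone else puzzling over `trg[:,:-1]`: the decoder input drops the final `<eos>` token while the loss targets drop the initial `<sos>`, so the prediction at step t is scored against token t+1. A sketch with hypothetical token ids:

```python
import torch

# Hypothetical ids: 2 = <sos>, 3 = <eos>, 10/11 = word tokens
trg = torch.tensor([[2, 10, 11, 3]])   # [batch size, trg len]

decoder_input = trg[:, :-1]   # [<sos>, w1, w2]  -> fed to the decoder
loss_target   = trg[:, 1:]    # [w1, w2, <eos>]  -> compared with the output
```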

Had to add

```python
import spacy.cli
spacy.cli.download("de_core_news_sm")
```

else it would not load `de_core_news_sm`. See https://stackoverflow.com/questions/62822737/oserror-e050-cant-find-model-de-it-doesnt-seem-to-be-a-shortcut-link-a