
Easy to use NLP library built on PyTorch and TorchText

Results: 8 torchnlp issues, sorted by recently updated

In the paper "Attention Is All You Need", the kernel size of the convolutions is set to 1, but I find that in this implementation the value is 3. Therefore, I am...
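For context, here is a minimal sketch (using stock PyTorch, not torchnlp's actual class names) of the position-wise feed-forward block built from nn.Conv1d: kernel_size=1 reproduces the strictly per-position FFN from the paper, while kernel_size=3 additionally mixes information from neighbouring positions.

```python
import torch
import torch.nn as nn

class PositionwiseFeedForward(nn.Module):
    """Position-wise feed-forward layer implemented with 1-D convolutions.

    With kernel_size=1 this is exactly the per-position FFN from
    "Attention Is All You Need"; with kernel_size=3 each output also
    depends on the two adjacent positions.
    """
    def __init__(self, d_model=512, d_ff=2048, kernel_size=1):
        super().__init__()
        padding = (kernel_size - 1) // 2  # keep the sequence length unchanged
        self.conv1 = nn.Conv1d(d_model, d_ff, kernel_size, padding=padding)
        self.conv2 = nn.Conv1d(d_ff, d_model, kernel_size, padding=padding)
        self.relu = nn.ReLU()

    def forward(self, x):
        # x: (batch, seq_len, d_model); Conv1d expects (batch, channels, seq_len)
        x = x.transpose(1, 2)
        x = self.conv2(self.relu(self.conv1(x)))
        return x.transpose(1, 2)

out = PositionwiseFeedForward(kernel_size=1)(torch.randn(2, 10, 512))
print(out.shape)  # torch.Size([2, 10, 512])
```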

Attempting to install torchnlp on 64-bit Windows 10 with Python 3.8.5 and torch 1.6.0 (CPU-only version; no CUDA). I cloned the repository to a non-system directory off the root...

Hello, hope you are well, and thank you so much for writing this awesome resource! I have a question about the training procedure. When I ran it in the command prompt...

I can't find the piece of code that does this job.

How can I modify the Transformer for time series analysis? In this case, would you also need to use masked attention heads?
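For reference, a minimal sketch of a causal (look-ahead) mask, which is the standard way to stop each time step from attending to future steps in autoregressive forecasting; the function name and usage below are illustrative, not part of torchnlp's API.

```python
import torch

def causal_mask(seq_len: int) -> torch.Tensor:
    """Boolean mask where True marks positions a query may NOT attend to.

    Row i can attend to positions 0..i only, so each time step never
    sees the future -- the usual requirement for autoregressive
    forecasting on time series.
    """
    return torch.triu(torch.ones(seq_len, seq_len, dtype=torch.bool), diagonal=1)

mask = causal_mask(5)
print(mask)
# Inside attention, apply it before the softmax:
#   scores = scores.masked_fill(mask, float('-inf'))
```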

The normalization seems different from the paper "Attention Is All You Need": in the paper, the normalization layer comes after the multi-head attention and feed-forward layers, whereas in torchnlp it comes before them: x =...
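To make the difference concrete, here is a small sketch of the two orderings in plain PyTorch; the sublayer is a stand-in nn.Linear rather than torchnlp's actual attention or feed-forward modules.

```python
import torch
import torch.nn as nn

d_model = 8
x = torch.randn(2, 5, d_model)
sublayer = nn.Linear(d_model, d_model)  # stand-in for MHA or feed-forward
norm = nn.LayerNorm(d_model)

# Post-norm, as in "Attention Is All You Need":
# normalize AFTER the residual connection.
post = norm(x + sublayer(x))

# Pre-norm, as described in the issue above:
# normalize the sublayer input, keeping the residual path unnormalized.
pre = x + sublayer(norm(x))
```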

Should it be possible to use only the Transformer's encoder part to train word accentuation for the Lithuanian language? In Lithuanian, stressing is somewhat tricky, as it can vary depending on...
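An encoder-only setup is plausible if accentuation is framed as per-token sequence labeling. Below is a minimal sketch using stock PyTorch modules rather than torchnlp's classes; the vocabulary size and label count are placeholders.

```python
import torch
import torch.nn as nn

class AccentTagger(nn.Module):
    """Encoder-only Transformer predicting a stress/accent label per token.

    Treats accentuation as sequence labeling: one label for every input
    character (or subword), so no decoder is needed.
    """
    def __init__(self, vocab_size=500, num_labels=4, d_model=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.classifier = nn.Linear(d_model, num_labels)

    def forward(self, tokens):
        # tokens: (batch, seq_len) -> logits: (batch, seq_len, num_labels)
        return self.classifier(self.encoder(self.embed(tokens)))

logits = AccentTagger()(torch.randint(0, 500, (2, 12)))
print(logits.shape)  # torch.Size([2, 12, 4])
```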

Hi @kolloldas, great job with the transformer. I was using your model to run a few basic experiments on sequence labeling and, after completing chunking and NER, wanted to move...