Models, data loaders and abstractions for language processing, powered by PyTorch
## 🚀 Feature Currently, when we call `get_tokenizer()` for any language other than English, we must rely on third-party libraries such as SpaCy or NLTK to get these tokenizers....
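To make the request concrete, here is a minimal, dependency-free sketch (not torchtext code) of the kind of built-in fallback tokenizer the feature envisions. It only lowercases, separates common punctuation, and splits on whitespace; real language-aware tokenization (lemmatization, compound splitting, etc.) is exactly what SpaCy and NLTK still provide.

```python
import re

def simple_tokenizer(text):
    """A hypothetical dependency-free tokenizer sketch: lowercase,
    separate basic punctuation, split on whitespace."""
    text = text.lower()
    # Put spaces around punctuation so each mark becomes its own token.
    text = re.sub(r"([.,!?;:])", r" \1 ", text)
    return text.split()

print(simple_tokenizer("Wie geht es dir?"))  # ['wie', 'geht', 'es', 'dir', '?']
```

A built-in like this would cover simple whitespace-delimited languages, but the issue's point stands: anything beyond that currently requires the third-party backends.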
Currently the release of TorchText 0.18 is blocked because it uses TorchData 0.7.1, which is not compatible with PyTorch 2.3 (https://github.com/pytorch/text/actions/runs/8365635868/job/22903939847#step:13:291). Solution: The first proposed solution is to cut a branch...
…nel' Cherry-picks 02a5901 to merge it into main. Removes `.github/workflows/build-conda-macos.yml` to resolve a merge conflict, as it was removed from trunk. We can create another PR to add it...
## ❓ Questions and Help The version of torchtext is 0.16.2. `inputs = torchtext.legacy.data.Field(lower=args.lower, tokenize='spacy')`, `answers = torchtext.legacy.data.Field(sequential=False)` raises `AttributeError: module 'torchtext' has no attribute 'legacy'`
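For readers hitting the same error: `torchtext.legacy` was removed in torchtext 0.12, so on 0.16.2 the old `Field` API no longer exists; newer versions compose a tokenizer with a vocabulary instead (e.g. `torchtext.data.utils.get_tokenizer` plus `torchtext.vocab.build_vocab_from_iterator`). Below is a dependency-free sketch of the preprocessing step that `Field(lower=..., tokenize=...)` used to perform; the function name and defaults are illustrative, not torchtext API.

```python
def preprocess(text, lower=True, tokenize=str.split):
    # Mirrors the old Field(lower=..., tokenize=...) preprocessing:
    # optionally lowercase, then tokenize.  In a real torchtext >= 0.12
    # pipeline, a spaCy tokenizer obtained via get_tokenizer("spacy")
    # would be passed as `tokenize` here.
    if lower:
        text = text.lower()
    return tokenize(text)

print(preprocess("The Quick Brown Fox"))  # ['the', 'quick', 'brown', 'fox']
```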
The output dimensions for one of the examples are wrong, since the batch size is 2, not 1. Further, there is no need to import RobertaClassificationHead for that example, as...
## 🐛 Bug **Describe the bug** Following the [t5_demo](https://pytorch.org/text/stable/tutorials/t5_demo.html), but when it tries to access the CNN data at `...
### 🐛 Describe the bug
```
!pip install torch==2.0.0+cu117 torchvision==0.15.1+cu117 torchaudio==2.0.1+cu117 --index-url https://download.pytorch.org/whl/cu117
!pip uninstall -y torchtext
!pip install torchtext==0.15.1
!pip install pytorch-lightning==1.5.0
import torchtext as tt
print(tt.__version__)
import pytorch_lightning...
```
## 🐛 Bug `torchtext` is in maintenance mode, but there is a problem with the current dependencies which I think may warrant an update and a minor version bump. This problem causes...
## undefined symbol PyTorch version 2.1.2. I am looking for a version of torchtext that will work with PyTorch 2.1.2. I have tried every version from 0.16.0 to 0.18.0. Each...
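Undefined-symbol errors like this usually come from mixing torchtext and PyTorch versions that were not built against each other. The pairing below reflects my reading of the repository's compatibility table (verify against the pytorch/text README before pinning); a small lookup like this makes the expected match explicit:

```python
# Assumed torchtext <-> PyTorch pairing, taken from the repo's
# compatibility table (verify before relying on it):
COMPAT = {
    "0.16.0": "2.1.0",
    "0.16.1": "2.1.1",
    "0.16.2": "2.1.2",
    "0.17.0": "2.2.0",
    "0.18.0": "2.3.0",
}

def torchtext_for(torch_version):
    """Return the torchtext release matching a given PyTorch version, or None."""
    for tt, pt in COMPAT.items():
        if pt == torch_version:
            return tt
    return None

print(torchtext_for("2.1.2"))  # 0.16.2
```

Under this table, PyTorch 2.1.2 would pair with torchtext 0.16.2; installing any other torchtext release against it would be expected to fail at import with an undefined-symbol error.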