transformers-tutorials
GitHub repo with tutorials to fine-tune transformers for different NLP tasks
In your custom data loader:

```python
class CustomDataset(Dataset):
    def __init__(self, tokenizer, sentences, labels, max_len):
        self.len = len(sentences)
        self.sentences = sentences
        self.labels = labels
        self.tokenizer = tokenizer
        self.max_len = max_len

    def...
```
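The snippet is cut off at `def...`; a minimal sketch of how such a dataset class typically continues (hypothetical: in practice it would subclass `torch.utils.data.Dataset`, and the output key names `ids`, `mask`, and `targets` are assumptions, not taken from the notebook):

```python
class CustomDataset:
    """Plain-Python sketch mirroring torch.utils.data.Dataset's
    __len__/__getitem__ protocol; subclass Dataset in real code."""

    def __init__(self, tokenizer, sentences, labels, max_len):
        self.len = len(sentences)
        self.sentences = sentences
        self.labels = labels
        self.tokenizer = tokenizer
        self.max_len = max_len

    def __getitem__(self, index):
        # Tokenize one sentence, padding/truncating to max_len.
        inputs = self.tokenizer.encode_plus(
            self.sentences[index],
            max_length=self.max_len,
            padding="max_length",
            truncation=True,
        )
        return {
            "ids": inputs["input_ids"],
            "mask": inputs["attention_mask"],
            "targets": self.labels[index],
        }

    def __len__(self):
        return self.len
```

A DataLoader would then iterate over this object and batch the returned dicts.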
In the `train` function, `outputs = model(input_ids=ids, attention_mask=mask, decoder_input_ids=y_ids, lm_labels=lm_labels)`: the parameter `lm_labels` should be `labels` in newer versions of transformers.
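If the notebook needs to run on both old and new transformers versions, one defensive option (a sketch; `label_kwarg` is a hypothetical helper, not part of the library) is to inspect the model's `forward` signature and pick whichever parameter name it actually accepts:

```python
import inspect


def label_kwarg(forward_fn):
    # Newer transformers renamed `lm_labels` to `labels`; return the
    # name that this model's forward() actually declares.
    params = inspect.signature(forward_fn).parameters
    return "labels" if "labels" in params else "lm_labels"
```

Usage would then look like `outputs = model(input_ids=ids, attention_mask=mask, decoder_input_ids=y_ids, **{label_kwarg(model.forward): lm_labels})`.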
When running the main function in Colab, I got the following error: `AttributeError: 'NoneType' object has no attribute 'batch_encode_plus'`. When I tried `tokenizer = T5Tokenizer.from_pretrained("t5-base")` and `print(type(tokenizer))`, I got,
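A commonly reported cause of this `NoneType` error is a missing `sentencepiece` dependency, which can leave the tokenizer unloaded. A small guard (a hypothetical helper, not part of transformers) fails fast with a hint instead of crashing later in `batch_encode_plus`:

```python
def require_tokenizer(tokenizer, name="t5-base"):
    # Guard against a silently failed load: if the tokenizer is None,
    # surface a hint rather than an opaque NoneType AttributeError.
    if tokenizer is None:
        raise RuntimeError(
            f"Tokenizer for {name!r} did not load; try "
            "`pip install sentencepiece` and restart the runtime."
        )
    return tokenizer
```

It would be called right after loading, e.g. `tokenizer = require_tokenizer(T5Tokenizer.from_pretrained("t5-base"))`.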
Hi, there is a `class BertForMultipleChoice` in `modeling_bert.py` in the transformers library. Does it do the same as the model in the notebook? Thanks!
Hi, firstly, great tutorial. The multiclass model tutorial really helped me understand how to leverage BERT. I have my model saved as shown post-training. However...
Hi, I am new to the field of NLP. I want to fine-tune MarianMT pretrained models for German-to-English translation, but I am not sure how to...
In the `train` function of transformers_summarization_wandb.ipynb, `lm_labels` is passed as a parameter. However, it no longer exists; `labels` is the correct parameter. I have tried this when I am finishing my...
In this example we use one feature; how would I adjust the class to handle more than one feature?
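One simple option (a hedged sketch; the column names are made up) is to join the extra text columns into a single string before tokenization. BERT-style tokenizers also support true sentence pairs via `encode_plus(text, text_pair=...)`, which inserts the model's separator token for you:

```python
def combine_features(example, feature_cols, sep=" [SEP] "):
    # Hypothetical helper: merge several text columns of one record
    # into a single string to feed the tokenizer. For two features,
    # passing them as a sentence pair to the tokenizer is the more
    # idiomatic alternative.
    return sep.join(str(example[col]) for col in feature_cols)
```

The dataset's `__getitem__` would then tokenize `combine_features(row, ["title", "description"])` instead of a single column.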
In the data folder I see only the README; the train.csv file isn't there.
The earlier code was giving a padding error, and the reason was the parameter name `lm_labels`: the newer version expects the parameter to be named just `labels`....