transformers-tutorials
GitHub repo with tutorials for fine-tuning transformers on different NLP tasks
Hi. I tried to load the model using `torch.load('model.bin')`, but this error occurred: `_pickle.UnpicklingError: invalid load key, '['`. Also, how do I run inference? Any help?
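The `invalid load key, '['` error usually means the file being unpickled is not a torch checkpoint at all (the file starts with `[`, e.g. a JSON file saved under the `model.bin` name). A minimal sketch of the usual save/load/inference pattern, assuming the multi-label notebook's `BERTClass` (recreated here) and hypothetical file paths:

```python
import torch
import transformers
from transformers import BertTokenizer

# The multi-label classifier from the notebook (6 toxic-comment labels).
class BERTClass(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.l1 = transformers.BertModel.from_pretrained(
            'bert-base-uncased', return_dict=False)
        self.l2 = torch.nn.Dropout(0.3)
        self.l3 = torch.nn.Linear(768, 6)

    def forward(self, ids, mask, token_type_ids):
        _, pooled = self.l1(ids, attention_mask=mask,
                            token_type_ids=token_type_ids)
        return self.l3(self.l2(pooled))

# Save only the learned weights; a bare state_dict pickles cleanly:
# torch.save(model.state_dict(), 'model.bin')

# Rebuild the architecture first, then load the weights into it.
model = BERTClass()
model.load_state_dict(torch.load('model.bin', map_location='cpu'))
model.eval()

# Inference on one sentence; sigmoid because the notebook trains with BCE.
tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')
enc = tokenizer("example text", return_tensors='pt', truncation=True,
                padding='max_length', max_length=200)
with torch.no_grad():
    logits = model(enc['input_ids'], enc['attention_mask'],
                   enc['token_type_ids'])
probs = torch.sigmoid(logits)
```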
https://github.com/abhimishra91/transformers-tutorials/blob/master/transformers_multi_label_classification.ipynb The `BertClass` forward function is causing the following error: `TypeError: dropout(): argument 'input' (position 1) must be Tensor, not str`. It looks like the notebook was written against version 3...
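On transformers v4+, `BertModel` returns a `ModelOutput` by default, so tuple-unpacking the result yields its field names (strings), and one of those strings ends up in `dropout()`. A sketch of the adjusted forward pass, keeping the notebook's layer names (`l1`, `l2`, `l3`):

```python
def forward(self, ids, mask, token_type_ids):
    output_1 = self.l1(ids, attention_mask=mask,
                       token_type_ids=token_type_ids)
    pooled = output_1.pooler_output  # was output_1[1] under v3 tuple outputs
    output_2 = self.l2(pooled)       # dropout now receives a Tensor
    return self.l3(output_2)
```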
Hi, first of all I want to thank you for this great work. I tried to execute the code in the NER-with-BERT notebook, but an error is generated when calculating...
Hi, thank you for your code. I have a question regarding the way the model is being trained: in the paper it is mentioned that T5 is trained based on...
`tr_loss`, `nb_tr_steps`, and `nb_tr_examples` are not defined or initialized before being updated in the loop of the `valid()` function.
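A sketch of the fix, initializing the running counters at the top of `valid()`. The loader field names follow the notebooks, and the model call assumes a head (as in the token-classification notebook) whose first return value is the loss:

```python
import torch

device = 'cuda' if torch.cuda.is_available() else 'cpu'

def valid(model, testing_loader):
    model.eval()
    # Initialize the running counters before the loop uses them.
    tr_loss, nb_tr_steps, nb_tr_examples = 0.0, 0, 0
    with torch.no_grad():
        for data in testing_loader:
            ids = data['ids'].to(device)
            mask = data['mask'].to(device)
            targets = data['targets'].to(device)
            loss = model(ids, attention_mask=mask, labels=targets)[0]
            tr_loss += loss.item()
            nb_tr_steps += 1
            nb_tr_examples += targets.size(0)
    print(f"Validation loss per step: {tr_loss / nb_tr_steps}")
```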
I am running the `transformers_multiclass_classification.ipynb` notebook and I do not get the same outputs. For example, the training loss drops almost immediately to (very nearly) 0 in this notebook...
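Run-to-run differences like this are often (though not necessarily here) down to unseeded training. A reproducibility sketch for the notebook's torch/numpy stack, in case it helps narrow the discrepancy:

```python
import random
import numpy as np
import torch

def set_seed(seed: int = 42):
    """Seed every RNG the notebooks touch so runs are comparable."""
    random.seed(seed)
    np.random.seed(seed)
    torch.manual_seed(seed)
    torch.cuda.manual_seed_all(seed)
    torch.backends.cudnn.deterministic = True  # slower, but repeatable
```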
re: `transformers_multiclass_classification.ipynb`. Thank you for this helpful tutorial! It seems to work well when the batch size (for either training or validation) divides the number of examples evenly,...
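If the problem is the short final batch, two common workarounds (an assumption about the truncated report, not a confirmed fix; `training_set` stands in for the notebook's Dataset object):

```python
from torch.utils.data import DataLoader

# Option 1: drop the incomplete final batch so every batch has equal size.
training_loader = DataLoader(training_set, batch_size=8,
                             shuffle=True, drop_last=True)

# Option 2: keep short batches, but weight the epoch average by true size
# so the last batch does not skew the reported loss.
def weighted_epoch_loss(batch_losses, batch_sizes):
    """Average per-batch mean losses, weighting each by its batch size."""
    total = sum(l * n for l, n in zip(batch_losses, batch_sizes))
    return total / sum(batch_sizes)
```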
Great tutorial, but I had to struggle with the latest `transformers` library. After some research, this worked for me.
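The snippet the poster found is not shown here; two common resolutions for these notebooks on newer library versions (assumptions, not the poster's exact fix) are pinning the release the code was written against, or opting back into tuple outputs:

```python
# Pin the era the notebooks were written for (version is an assumption):
#   pip install "transformers==3.0.2"
#
# Or, on transformers v4+, ask for tuple outputs so `output[1]` still works:
from transformers import BertModel

l1 = BertModel.from_pretrained('bert-base-uncased', return_dict=False)
```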
How do I save the fine-tuned model from this notebook: https://github.com/abhimishra91/transformers-tutorials/blob/master/transformers_summarization_wandb.ipynb? I'd like to save it locally or on the Hugging Face Hub for future use.
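A minimal sketch, where `model` and `tokenizer` are the fine-tuned T5 objects from the notebook and the path/repo names are hypothetical:

```python
# Save locally (writes config, weights, and tokenizer files).
model.save_pretrained('./t5-summarization-finetuned')
tokenizer.save_pretrained('./t5-summarization-finetuned')

# Reload later for inference.
from transformers import T5ForConditionalGeneration, T5Tokenizer
model = T5ForConditionalGeneration.from_pretrained('./t5-summarization-finetuned')
tokenizer = T5Tokenizer.from_pretrained('./t5-summarization-finetuned')

# Optionally push to the Hugging Face Hub (newer transformers versions;
# requires `huggingface-cli login` first):
# model.push_to_hub('your-username/t5-summarization-finetuned')
```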
Hello @abhimishra91, I was trying to implement the fine-tuning of T5 as explained in your [notebook](https://github.com/abhimishra91/transformers-tutorials/blob/master/transformers_summarization_wandb.ipynb). In addition to implementing the same structure as you, I have made...