
Contains notebooks for various transformer-based models applied to different NLP tasks.

4 issues, sorted by recently updated:

May I know the data location, please?

The current code does not support a multi-GPU environment. I parallelized the model with `torch.nn.DataParallel(model).cuda()`, but I got this error when I tried it: ![16032841374066409239620202407970](https://user-images.githubusercontent.com/11595859/96720897-02b98900-13c9-11eb-9df4-f210062686d9.jpg)
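For reference, a minimal sketch (not the repo's code; the model class is an assumption) of how a transformer model is typically wrapped in `torch.nn.DataParallel` for multi-GPU use, including the usual unwrapping step before saving:

```python
# Sketch only: assumes a RoBERTa classification model; adjust to the actual model used.
import torch
from transformers import RobertaForSequenceClassification

model = RobertaForSequenceClassification.from_pretrained("roberta-base")

if torch.cuda.device_count() > 1:
    # DataParallel splits each batch across the visible GPUs and gathers the outputs.
    model = torch.nn.DataParallel(model)
model = model.cuda()

# When saving, unwrap the DataParallel wrapper so the checkpoint's state_dict keys
# do not carry a "module." prefix.
model_to_save = model.module if hasattr(model, "module") else model
```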

I noticed the fine-tuned RoBERTa script saves the fine-tuned model locally with
```
model_to_save = model
torch.save(model_to_save, output_model_file)
tokenizer.save_vocabulary(output_vocab_file)
```
How do I load this model from the local folder? I...
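A minimal loading sketch, assuming the model was pickled as a whole object with `torch.save(model, ...)` and the vocabulary was written to a directory, as the excerpt above suggests (file and directory names here are hypothetical):

```python
# Sketch only: paths are placeholders, not the repo's actual output locations.
import torch
from transformers import RobertaTokenizer

output_model_file = "pytorch_roberta_finetuned.bin"  # hypothetical path
output_vocab_dir = "./saved_vocab/"                  # hypothetical directory

# torch.save(model, ...) pickles the full module, so torch.load returns the model
# directly (the model class must still be importable in the loading environment).
model = torch.load(output_model_file, map_location="cpu")
model.eval()

# save_vocabulary writes vocab.json and merges.txt, so the tokenizer can be
# restored from that directory with from_pretrained.
tokenizer = RobertaTokenizer.from_pretrained(output_vocab_dir)
```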

I am using this code only for multilabel classification of text into 26 labels, but I am getting only a 38% Hamming score and a 28% flat_score. Please suggest some changes in my code...
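For context on the two metrics mentioned above, a small sketch (dummy data, not the asker's code) of one common way to compute them for multilabel outputs with scikit-learn:

```python
# Sketch only: y_true / y_pred are toy binary indicator matrices; in the issue's
# setting they would have shape (n_samples, 26).
import numpy as np
from sklearn.metrics import accuracy_score, hamming_loss

y_true = np.array([[1, 0, 1], [0, 1, 0]])
y_pred = np.array([[1, 0, 0], [0, 1, 0]])

# Hamming score here = fraction of individual label decisions that are correct.
hamming_score = 1.0 - hamming_loss(y_true, y_pred)

# Flat (exact-match) score = fraction of samples where ALL labels match.
flat_score = accuracy_score(y_true, y_pred)

print(f"Hamming score: {hamming_score:.2f}, flat score: {flat_score:.2f}")
```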