pytorch-transformers-classification
How to make predictions
Can you teach me how to make predictions after the model is trained? Does it come with a built-in method like .predict()?
Please use Simple Transformers as this repo is now outdated.
To answer your question, this repo does not have such a built-in function whereas Simple Transformers does. If you have already trained a model, you can use it with Simple Transformers as well.
With this repo, you would have to make predictions manually following the same procedure as for evaluation.
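To illustrate the manual route: the evaluation procedure ends with a forward pass that produces one row of logits per input, and the predicted label is the argmax of those logits (softmax only if you also want probabilities). The logit values below are hypothetical, standing in for a real model's forward pass; this is a minimal sketch of that final step, not this repo's code:

```python
import math

def softmax(logits):
    # Numerically stable softmax over a list of raw scores.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def predict_label(logits):
    # Prediction = index of the highest logit, as in an evaluation loop.
    probs = softmax(logits)
    return max(range(len(probs)), key=lambda i: probs[i]), probs

# Example: logits from a forward pass on one sentence (made-up values).
label, probs = predict_label([-1.2, 3.4])
print(label)  # 1
```

In practice you would tokenize the unlabeled text exactly as the evaluation code does, run the trained model to get the logits, and apply this last step per example.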
Thanks for the advice. If so, how can I save a model from Simple Transformers and export it for real-world usage?
from simpletransformers.classification import ClassificationModel
model = ClassificationModel('roberta', '<path_to_directory_containing_trained_model>')
model.predict([<list_of_text>])
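As I understand the Simple Transformers API, predict() returns a tuple of predicted labels and raw model outputs, one entry per input text. The stub below only mirrors that shape so the unpacking is clear; the class and values are made up, not a real model:

```python
# Illustrative stand-in for ClassificationModel, mimicking the
# (predictions, raw_outputs) tuple that predict() returns.
class FakeClassificationModel:
    def predict(self, texts):
        # One predicted label and one row of raw logits per input text
        # (hypothetical values, not produced by a real model).
        predictions = [1 for _ in texts]
        raw_outputs = [[-1.2, 3.4] for _ in texts]
        return predictions, raw_outputs

model = FakeClassificationModel()
predictions, raw_outputs = model.predict(["first sentence", "second sentence"])
print(predictions)  # [1, 1]
```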
Thank you very much! Just to confirm: it shows you can save the model using the train_model method?
After the model is saved, how am I supposed to load it? Is there a built-in method for that? I didn't see one in the tutorial.
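If I follow the snippet above correctly, loading is just passing the saved-model directory back into ClassificationModel. A small, hypothetical sanity check (the helper name is mine, and the file names assume the usual pytorch-transformers save format of config.json plus pytorch_model.bin) can confirm a directory looks loadable before you point the library at it:

```python
import tempfile
from pathlib import Path

def looks_like_saved_model(model_dir):
    # Hypothetical helper: check the directory contains the files a
    # pytorch-transformers-style save is expected to produce.
    d = Path(model_dir)
    required = ["config.json", "pytorch_model.bin"]
    return d.is_dir() and all((d / name).is_file() for name in required)

# Simulate a saved-model directory to show the check passing.
with tempfile.TemporaryDirectory() as tmp:
    for name in ("config.json", "pytorch_model.bin"):
        (Path(tmp) / name).touch()
    result = looks_like_saved_model(tmp)

print(result)  # True
```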
1. By '<path_to_directory_containing_trained_model>' in model = ClassificationModel('roberta', '<path_to_directory_containing_trained_model>'), do you mean we should just pass the "output" folder generated during training as the directory? I trained the BERT model using run_model.py and now need to make predictions for an unlabeled dataset I have.
2. We should put these lines of code at the end, after the evaluation, right?
I would highly appreciate it if you could advise me on this.
This answer was regarding the Simple Transformers library.
I am not sure whether you are referring to Simple Transformers or this repo. Please open an issue on Simple Transformers if it is related to that.
I trained the BERT model using "run_model.py" in this repository and now need to make predictions for an unlabeled dataset I have. I am somewhat confused about how to make those predictions. Would you please advise me on that? Thank you.