allennlp-template-python-script

Unable to load the trained model for inference

Open sreeramjoopudi opened this issue 4 years ago • 0 comments

I have been using AllenNLP for the past year and have successfully trained and run inference on models through config files. Recently, I wanted to train and load models without using config files. I was able to train a model by using allennlp as a library. However, when I tried to load this model for inference in a separate process/Python script, I ran into a missing config.json error: the load_archive method of allennlp.models.archival throws a missing config.json error when I point it at the trained model's output directory. Could you tell us:

  • if this is expected, and whether the way to overcome it is to create a config.json of our own (I believe it should be possible to create my own config.json if needed for running inference), or

  • is there any other way in which I should load the trained model?
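For context, here is a minimal sketch of the config-free workaround I have been considering: since load_archive needs a config.json to rebuild the model, a script that already constructs the model in code can skip archives entirely and reload just the weights file with load_state_dict. The TinyClassifier class and file path below are hypothetical stand-ins, not part of AllenNLP; the pattern is plain PyTorch serialization.

```python
import os
import tempfile

import torch
import torch.nn as nn


# Hypothetical stand-in for a model that is built in code
# rather than from a config file.
class TinyClassifier(nn.Module):
    def __init__(self) -> None:
        super().__init__()
        self.proj = nn.Linear(4, 2)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.proj(x)


weights_path = os.path.join(tempfile.mkdtemp(), "best.th")

# "Training" script: save only the weights (a state dict).
model = TinyClassifier()
torch.save(model.state_dict(), weights_path)

# "Inference" script: rebuild the same model in code, then load
# the saved weights. No config.json or load_archive involved.
model2 = TinyClassifier()
model2.load_state_dict(torch.load(weights_path))
model2.eval()
```

The trade-off is that the inference script must construct the model (and, for a real AllenNLP model, its vocabulary) with exactly the same code and hyperparameters used at training time, which is precisely the bookkeeping that config.json normally captures.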

sreeramjoopudi · May 10 '21 19:05