allennlp-template-python-script
Unable to load the trained model for inference
I have been using AllenNLP for the past year and have successfully trained models and run inference on them through config files. Recently, I wanted to train and load models without using config files. I was able to train a model by using allennlp as a library. However, when I tried to load this model in a separate process/Python script (for inference), I ran into a missing `config.json` issue: the `load_archive` method of `allennlp.models.archival` throws a "missing config.json file" error when I point it at the trained model's output directory. Can you tell us:
- whether this is expected, and whether the way to overcome it is to create a `config.json` on our own (I believe it should be possible to create my own `config.json` if needed for running inference), or
- whether there is some other way in which I should load the trained model.
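For reference, this is a minimal sketch of what I imagine the hand-written `config.json` route would look like (the `my_project` / `my_reader` / `my_model` names and the paths are placeholders, and I'm assuming the weights were saved as `best.th` and the vocabulary under `<serialization_dir>/vocabulary`):

```python
import my_project  # placeholder: imported so my @DatasetReader.register / @Model.register calls run

from allennlp.common.params import Params
from allennlp.models.archival import archive_model, load_archive

serialization_dir = "path/to/serialization_dir"  # output directory of my training script

# Hand-written config mirroring the objects I constructed in Python during training.
params = Params({
    "dataset_reader": {"type": "my_reader"},
    "model": {"type": "my_model"},  # plus whatever constructor arguments from_params needs
})
params.to_file(f"{serialization_dir}/config.json")

# Package config + weights + vocabulary into model.tar.gz, then load it back the usual way.
archive_model(serialization_dir, weights="best.th")
archive = load_archive(f"{serialization_dir}/model.tar.gz")
model = archive.model
```

The only alternative I can think of is to skip `load_archive` entirely: rebuild the model object in Python the same way the training script does, restore the vocabulary with `Vocabulary.from_files(...)`, and load the weights with `model.load_state_dict(torch.load("best.th"))`. I'm not sure which of these is the intended approach for this template.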