LLMZoo
Load checkpoint in inference
I trained the model and saved the checkpoint successfully. After that, how can I load the checkpoint for inference? I tried using the checkpoint path as the model path, but I get "does not appear to have a file named config.json."
Dear @fitexmage,
Could you provide more details, e.g., the Transformers version? Is there a config.json in the specified path?
Note that the path should be the directory to the checkpoint instead of the ".bin" path.
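For reference, a minimal sketch of the point above: from_pretrained expects the checkpoint directory, not the weights file. The helper name and the checkpoint path are illustrative (the path is taken from later in this thread), not part of LLMZoo:

```python
import os

def resolve_checkpoint_dir(path):
    # from_pretrained expects the directory containing config.json and the
    # weight files, not the ".bin" file itself; map a file path to its parent.
    if path.endswith(".bin"):
        return os.path.dirname(path)
    return path

# Typical use (assumes the checkpoint directory actually exists):
# from transformers import AutoModelForCausalLM, AutoTokenizer
# ckpt = resolve_checkpoint_dir("checkpoints/phoenix_7b/pytorch_model.bin")
# model = AutoModelForCausalLM.from_pretrained(ckpt)
# tokenizer = AutoTokenizer.from_pretrained(ckpt)
```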
Best, Zhihong
Hi @zhjohnchan, thanks for your response. After posting my issue, I tried installing a lower version of transformers (4.28.0) and it works! Transformers versions higher than 4.29 do not seem to produce a loadable checkpoint.
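The workaround above amounts to pinning transformers below 4.29 (e.g. pip install transformers==4.28.0). A small sketch that just encodes this thread's observation as a version check; the 4.29 threshold comes from this report, not from any official changelog:

```python
def produces_loadable_checkpoint(transformers_version):
    # Observation from this thread: 4.28.0 saved a checkpoint with
    # config.json, while 4.29 and higher reportedly did not.
    major, minor = (int(part) for part in transformers_version.split(".")[:2])
    return (major, minor) < (4, 29)
```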
Same error!
Hi, how did you solve it? A lower version of transformers does not work for me.
Hi, after saving the model, I just changed the model path to the checkpoint and it worked, like python -m llmzoo.deploy.cli --model-path checkpoints/phoenix_7b/. config.json should appear in the checkpoint directory if transformers is 4.28.0.
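A quick way to tell whether a checkpoint directory will load is to check its listing for config.json before launching the CLI. A hypothetical helper (not part of LLMZoo):

```python
def missing_required_files(ckpt_dir_listing):
    # Given the directory listing (e.g. os.listdir(ckpt_dir)), return
    # required files that are absent. config.json is the file whose absence
    # triggers "does not appear to have a file named config.json".
    required = {"config.json"}
    return sorted(required - set(ckpt_dir_listing))
```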