How to load the fine-tuned model for inference?
Thank you for your excellent work. I followed the fine-tuning instructions you provided, running make experiments/ssp_2m_mini_bbox_base/.done_finetune to train on my custom dataset. However, I ran into difficulties when trying to load the fine-tuned model for inference. Since this command trains a base model, I adjusted the parameters of the Jupyter pipeline inference notebook accordingly.
Unfortunately, I encountered an error during loading.
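To be concrete about the loading step, here is a minimal sketch of what I am attempting (the checkpoint path and the "model" key are my own guesses, not something documented in the repo):

```python
import torch

# Assumed output location of the fine-tuning make target; adjust this to
# whatever file the run actually wrote under experiments/ssp_2m_mini_bbox_base/.
ckpt = torch.load(
    "experiments/ssp_2m_mini_bbox_base/model/best.pt",
    map_location="cpu",
    weights_only=False,  # training checkpoints may pickle non-tensor objects
)

# If this is a full training checkpoint rather than a bare state dict,
# unwrap the weights and strip a possible DistributedDataParallel "module."
# prefix before handing them to the notebook pipeline.
state_dict = ckpt["model"] if isinstance(ckpt, dict) and "model" in ckpt else ckpt
state_dict = {k.removeprefix("module."): v for k, v in state_dict.items()}
```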
I noticed that the file sizes of unitable_large_content.pt and ssp_2m_large.pt differ significantly, which suggests that ssp_2m_large.pt (or ssp_2m_base.pt) may not be directly loadable with the provided pipeline method.
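If it helps with the diagnosis, a quick inspection along these lines (paths are placeholders) should show whether each file is a bare state dict or a full training checkpoint with optimizer state, which would also explain the size difference:

```python
import torch

# Placeholder paths; point these at the actual checkpoint files on disk.
for name in ["unitable_large_content.pt", "ssp_2m_large.pt"]:
    ckpt = torch.load(name, map_location="cpu", weights_only=False)
    if isinstance(ckpt, dict) and all(torch.is_tensor(v) for v in ckpt.values()):
        # Bare state dict: count tensors and total parameters.
        n_params = sum(v.numel() for v in ckpt.values())
        print(f"{name}: bare state dict, {len(ckpt)} tensors, {n_params:,} params")
    elif isinstance(ckpt, dict):
        # Training checkpoint: weights nested alongside optimizer state, epoch, etc.
        print(f"{name}: training checkpoint, top-level keys = {list(ckpt.keys())}")
    else:
        print(f"{name}: unexpected object of type {type(ckpt).__name__}")
```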
Could you please provide a guide on how to load the fine-tuned model for inference? I am looking forward to your reply. Thank you very much!
I ran into the same error. Has anyone solved it?
Hello, may I ask how you run model inference, and which script you use for it?