How to load the ConvNeXt model from a local .bin file
Hello, thanks for your work.
I would like to ask how to make the ConvNeXt model use local weight files instead of downloading them every time. How should this be done? I tried modifying the code, but to no avail.
Could you please give me some guidance? Thank you very much.
I have the same problem
Download the weights from here, then change the argument `--vision_tower_slow convnext_large_mlp.clip_laion2b_ft_320` to your local download path, e.g., `./weights/convnext_large_mlp.clip_laion2b_ft_320`.
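If editing the launch argument alone doesn't take effect, a minimal sketch of loading the checkpoint from disk with timm is below. The `pretrained_cfg_overlay=dict(file=...)` mechanism exists in recent timm versions; the exact local path and filename here are assumptions based on where you saved the download, not something fixed by this repo.

```python
# Sketch: point timm at a local checkpoint instead of the Hub.
# The path below is hypothetical -- adjust it to wherever you saved the weights.
import os

LOCAL_WEIGHTS = "./weights/convnext_large_mlp.clip_laion2b_ft_320/open_clip_pytorch_model.bin"

def resolve_checkpoint(path: str) -> str:
    """Fail fast with a clear error if the local checkpoint file is missing."""
    if not os.path.isfile(path):
        raise FileNotFoundError(f"checkpoint not found: {path}")
    return path

# Assuming timm >= 0.9 is installed, the model can then be created with:
# import timm
# model = timm.create_model(
#     "convnext_large_mlp.clip_laion2b_ft_320",
#     pretrained=True,
#     pretrained_cfg_overlay=dict(file=resolve_checkpoint(LOCAL_WEIGHTS)),
# )
```

Checking the file path up front makes it obvious whether the problem is a wrong path or a network issue.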
I did it exactly the way you said, but it didn't work
This is probably because your network cannot reach the Hugging Face server. As far as I know, even when loading weights locally, the library may still try to contact the Hugging Face Hub. I recommend using a VPN such as pigcha.
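Alternatively, you can tell the Hugging Face libraries not to touch the network at all. A minimal sketch, assuming the stack uses `huggingface_hub` (and possibly `transformers`), which both honor standard offline environment variables; these must be set before the libraries are imported:

```python
# Sketch: force offline mode so nothing tries to reach the Hugging Face Hub.
# Set these BEFORE importing timm / transformers / huggingface_hub,
# because the libraries read them at import or first use.
import os

os.environ["HF_HUB_OFFLINE"] = "1"        # huggingface_hub: use local cache only
os.environ["TRANSFORMERS_OFFLINE"] = "1"  # transformers, if it is part of the stack

# import timm  # import the model libraries only after the variables are set
```

With these set, a missing local file fails immediately with a clear error instead of hanging on a connection attempt.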
Which model/weights should we download exactly? There are so many choices.
Did you solve this problem?
