NExT-GPT
Some problem with the weight-combining step
The generated files are .safetensors instead of .bin files.
Same question. When I use weights like this one for demo_app.py, NExT-GPT can't generate any multi-modality output; some weights might be missing. I guess this is related to:
same...
Now I have figured it out. It is very likely that the llama weights you selected are not appropriate. My earlier attempt failed because I chose an incorrect .pth file of about 13G, and three .safetensors files were generated instead of two .bin files. When selecting the llama weights, you should choose a Hugging Face repo consisting of two .bin files, like this one: https://huggingface.co/yahma/llama-7b-hf/tree/main. That way, the combined output is likely to be .bin. Good luck!
Hi @Michael4933, I used the Hugging Face link you suggested but still get .safetensors files. Did you use the llama parameters from https://huggingface.co/yahma/llama-7b-hf/tree/main to generate the .bin files? Thank you!
Pinning the transformers version to the one in requirements.txt addressed this for me.
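This would explain the symptom: newer transformers releases default `save_pretrained` to safetensors output, while older ones wrote .bin shards. If the combine script saves through transformers, passing `safe_serialization=False` should force .bin output regardless of the installed version. A sketch using a tiny stand-in model (the real script would save the merged llama weights instead; whether it exposes this flag is an assumption to verify against the actual code):

```python
import os
from transformers import GPT2Config, GPT2LMHeadModel

# tiny random model just to demonstrate the flag without downloading 7B weights
model = GPT2LMHeadModel(GPT2Config(n_layer=1, n_head=2, n_embd=8, vocab_size=32))

# safe_serialization=False restores the legacy pytorch_model.bin output
model.save_pretrained("out_bin", safe_serialization=False)
print(sorted(f for f in os.listdir("out_bin")
             if f.endswith((".bin", ".safetensors"))))  # ['pytorch_model.bin']
```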
Hi @Michael4933, I used the Hugging Face link you suggested but still get .safetensors files. Did you use the llama parameters from https://huggingface.co/yahma/llama-7b-hf/tree/main to generate the .bin files? Thank you!
Hi, do you know how to fix it now?
Now I have figured it out. It is very likely that the llama weights you selected are not appropriate. My earlier attempt failed because I chose an incorrect .pth file of about 13G, and three .safetensors files were generated instead of two .bin files. When selecting the llama weights, you should choose a Hugging Face repo consisting of two .bin files, like this one: https://huggingface.co/yahma/llama-7b-hf/tree/main. That way, the combined output is likely to be .bin. Good luck!
Can I get a copy of yours? Thank you very much!