Yi
When using web-demo on VL-6B
Reminder
- [X] I have searched the Github Discussion and issues and have not found anything similar to this.
Environment
- OS: WSL Ubuntu 20.04
- Python: 3.10
- PyTorch: 2.1.2
- CUDA: 12.2
Current Behavior
Hi experts,
When using web_demo, why is this error raised?
Thanks
python web_demo.py -c "/home/root123/aml/Yi-VL-6B"
Traceback (most recent call last):
  File "/home/root123/aml/Yi/demo/web_demo.py", line 211, in <module>
Expected Behavior
No response
Steps to Reproduce
1. WSL
2. Using Yi-VL-6B
3. Using web_demo.py
Anything Else?
no
I was unable to reproduce this error, but it might be because I was using a different OS. Would you please be more specific about your steps to reproduce the error? BTW, I believe you meant --model-path instead of -c in your code; if not, please try using --model-path to specify the model path.
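The "unrecognized arguments" behavior that comes up later in this thread follows from how argparse handles unknown flags. Here is a minimal sketch, assuming web_demo.py defines its --model-path flag via argparse (the actual parser in the Yi repo may differ):

```python
# Minimal sketch of the flag mismatch, assuming web_demo.py defines
# --model-path via argparse (the real parser in the Yi repo may differ).
import argparse

parser = argparse.ArgumentParser(description="web_demo.py sketch")
parser.add_argument("--model-path", type=str, help="path to the model checkpoint")

# Passing -c, which the parser never defined, triggers
# "error: unrecognized arguments: -c ..." and a non-zero exit.
try:
    parser.parse_args(["-c", "/path/to/Yi-VL-6B"])
except SystemExit:
    print("argparse rejected -c as an unrecognized argument")

# The defined flag parses cleanly:
args = parser.parse_args(["--model-path", "/path/to/Yi-VL-6B"])
print(args.model_path)
```

Note that argparse exposes --model-path as args.model_path (hyphens become underscores), and exits with status 2 on unrecognized arguments.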
Yes, -c, and this problem also occurs on an A100 with Ubuntu 18.04, torch 2.1, CUDA 12.1:

root@A100:/aml/Yi/demo# python web_demo.py -c '/aml2/Yi-VL-6B' --share
Traceback (most recent call last):
  File "/aml/Yi/demo/web_demo.py", line 211, in <module>
    model = AutoModelForCausalLM.from_pretrained(
  File "/aml/conda/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 569, in from_pretrained
    raise ValueError(
ValueError: Unrecognized configuration class <class 'transformers.models.llava.configuration_llava.LlavaConfig'> for this kind of AutoModel: AutoModelForCausalLM.
Model type should be one of BartConfig, BertConfig, BertGenerationConfig, BigBirdConfig, BigBirdPegasusConfig, BioGptConfig, BlenderbotConfig, BlenderbotSmallConfig, BloomConfig, CamembertConfig, LlamaConfig, CodeGenConfig, CpmAntConfig, CTRLConfig, Data2VecTextConfig, ElectraConfig, ErnieConfig, FalconConfig, FuyuConfig, GitConfig, GPT2Config, GPT2Config, GPTBigCodeConfig, GPTNeoConfig, GPTNeoXConfig, GPTNeoXJapaneseConfig, GPTJConfig, LlamaConfig, MarianConfig, MBartConfig, MegaConfig, MegatronBertConfig, MistralConfig, MixtralConfig, MptConfig, MusicgenConfig, MvpConfig, OpenLlamaConfig, OpenAIGPTConfig, OPTConfig, PegasusConfig, PersimmonConfig, PhiConfig, PLBartConfig, ProphetNetConfig, QDQBertConfig, Qwen2Config, ReformerConfig, RemBertConfig, RobertaConfig, RobertaPreLayerNormConfig, RoCBertConfig, RoFormerConfig, RwkvConfig, Speech2Text2Config, TransfoXLConfig, TrOCRConfig, WhisperConfig, XGLMConfig, XLMConfig, XLMProphetNetConfig, XLMRobertaConfig, XLMRobertaXLConfig, XLNetConfig, XmodConfig.

BTW, single inference works fine.
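For reference, the ValueError comes from AutoModelForCausalLM's config-to-model mapping: Yi-VL checkpoints ship a LLaVA-style config (model_type "llava"), which has no entry in that mapping, while text-only Yi checkpoints use LlamaConfig, which does. A hedged sketch of a pre-flight check (the helper name is ours, not the Yi repo's; requires a transformers version with the Auto classes):

```python
# Hedged sketch: check whether AutoModelForCausalLM can load a checkpoint's
# config before calling from_pretrained. The helper name is ours, not the
# Yi repo's. MODEL_FOR_CAUSAL_LM_MAPPING maps config classes to model classes.
from transformers import AutoConfig
from transformers.models.auto.modeling_auto import MODEL_FOR_CAUSAL_LM_MAPPING

def supported_by_causal_lm(config) -> bool:
    """True if AutoModelForCausalLM has a model class for this config type."""
    return type(config) in MODEL_FOR_CAUSAL_LM_MAPPING

# Usage (path is a placeholder):
#   cfg = AutoConfig.from_pretrained("/path/to/Yi-VL-6B")  # a LlavaConfig
#   supported_by_causal_lm(cfg)                            # False for LLaVA-style configs
```

This is why the same checkpoint works through the repo's LLaVA-based loading code but fails through AutoModelForCausalLM.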
While I was running with -c, I found this error:

web_demo.py: error: unrecognized arguments: -c /content/Yi-VL-6B

I don't think -c is a valid argument for web_demo.py; have you modified any files in the repo?
It was due to my mistake, not the code. Sorry, please close it.

How did you solve this problem? I ran into it too.

You need to run the file with the same name (web_demo.py) that is under the VL path (/VL).

Thank you. It works.