JBurtn
Need to wait for (or help with) #4888 and #4942 before this can be implemented. Possibly even more issues beyond those.
Also, why manually replace the files? Why not use monkeypatching, or re-register your versions with transformers using `exist_ok=True`?
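To illustrate the monkeypatching alternative: instead of overwriting files on disk, you can swap the attribute in at runtime. This is a minimal, self-contained sketch in plain Python (no transformers dependency; `lib` and both `generate` functions are made-up stand-ins) — for the register route, transformers' `AutoConfig.register`/`AutoModel.register` accept `exist_ok=True` to overwrite an existing mapping.

```python
import types

# Hypothetical stand-in for a library module whose behavior we want to change.
lib = types.ModuleType("lib")

def original_generate(prompt):
    return f"original: {prompt}"

lib.generate = original_generate

# Monkeypatch: rebind the attribute at runtime instead of editing the file on disk.
def patched_generate(prompt):
    return f"patched: {prompt}"

lib.generate = patched_generate

print(lib.generate("hi"))  # the patched version is called
```

The same idea applies to any imported module: patch the attribute after import, and every caller that looks it up through the module sees the replacement.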
> Abnormal error screenshot
> fp16 bf16 can be converted to each other at any time, it should not be a problem of data type

Afaik, it can be a problem,...
Also comment out `standardize_cache_format=standardize_cache_format`. Later versions of transformers remove this parameter, and cog only ever references it here, AFAIK.
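Rather than commenting the argument out by hand, one defensive pattern (a sketch, not what cog actually does; `old_api`/`new_api` are hypothetical) is to inspect the callee's signature and drop keyword arguments it no longer accepts, so the same call site survives the parameter's removal:

```python
import inspect

def call_with_supported_kwargs(fn, *args, **kwargs):
    # Drop kwargs the target no longer accepts (e.g. standardize_cache_format
    # was removed from transformers' generation internals in later versions).
    params = inspect.signature(fn).parameters
    supported = {k: v for k, v in kwargs.items() if k in params}
    return fn(*args, **supported)

def old_api(x, standardize_cache_format=False):  # older version: parameter exists
    return (x, standardize_cache_format)

def new_api(x):  # newer version: parameter removed
    return x

print(call_with_supported_kwargs(old_api, 1, standardize_cache_format=True))  # (1, True)
print(call_with_supported_kwargs(new_api, 1, standardize_cache_format=True))  # 1
```

This keeps the caller compatible with both library versions, at the cost of silently ignoring a kwarg when it disappears.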
Even simpler example:

```python
from transformers import AutoConfig

model_id = "Efficient-Large-Model/VILA1.5-40b"
config = AutoConfig.from_pretrained(model_id, trust_remote_code=True)  # Error here
print(config)
```
I copied what I needed from run_vila.py and it worked. If you do `from VILA.llava.model import *`, it should fix the llava_llama issue. It still complains about missing weights (even...
I'm curious too. I think @ROIM1998 is talking about this: https://github.com/NVlabs/VILA/blob/da98f3b98191540bbc52a9feea7102e1268b9c4c/llava/model/language_model/builder.py#L87-L108 There is also `"tokenizer_padding_side": "right"` from the config.json.