Your demo code on HuggingFace is throwing a 502 Gateway error.
Please provide instructions on how to evaluate the AlphaCLIP MLLM model.
The OpenXLab resource is limited, but the source code is available in the demo directory, so you can run it locally.
Hey SunzeY,
Your with_llm demo is throwing the following error. I'm running the code in the rocm/pytorch:rocm6.1.2_ubuntu20.04_py3.9_pytorch_release-2.1.2 Docker image on an AMD GPU:
ImportError: cannot import name '_expand_mask' from 'transformers.models.bloom.modeling_bloom' (/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bloom/modeling_bloom.py)
Maybe your package versions are mismatched? You should make sure you can run the original LLaVA code in your environment before testing Alpha-CLIP; any environment problem is better solved in the LLaVA repo.
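A quick way to compare your container against a known-good LLaVA environment is to print the versions the demo actually sees (nothing repo-specific, just the packages themselves):

```python
import torch
import torchvision
import transformers

# Print the versions loaded inside the container, to compare
# against an environment where the original LLaVA code runs.
print("torch:", torch.__version__)
print("torchvision:", torchvision.__version__)
print("transformers:", transformers.__version__)
```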
Also, earlier I got an error at https://github.com/SunzeY/AlphaCLIP/blob/main/demo/with_llm/llava/model/language_model/llava_llama.py#L139
The model name "llava" is already registered by the transformers library's own LLaVA model.
Is it expected that I change the registration name on line 139 when I run the demo?
You can change the transformers version to 4.36.2, as newer releases already include LLaVA as an official model. If nothing goes wrong after changing the name, that's also fine.
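If you go the renaming route instead of pinning, note that in LLaVA-style code the `AutoConfig.register` call pairs with the config's `model_type`, so both must change together. A minimal sketch, assuming the file follows the upstream LLaVA layout (the name `llava_alpha` is just an example, not something the repo defines):

```python
from transformers import AutoConfig, AutoModelForCausalLM, LlamaConfig, LlamaForCausalLM

# Hypothetical replacement name: anything that does not collide with the
# model types shipped in transformers (plain "llava" does from ~4.36 on).
class LlavaConfig(LlamaConfig):
    model_type = "llava_alpha"  # must match the string registered below

class LlavaLlamaForCausalLM(LlamaForCausalLM):
    config_class = LlavaConfig

AutoConfig.register("llava_alpha", LlavaConfig)
AutoModelForCausalLM.register(LlavaConfig, LlavaLlamaForCausalLM)
```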
After changing the name, I got the error below on transformers 4.42+:
ImportError: cannot import name '_expand_mask' from 'transformers.models.bloom.modeling_bloom' (/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bloom/modeling_bloom.py)
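For reference, a small probe that turns this into a clearer failure rather than a bare ImportError (the pin is the one suggested above; nothing here is repo-specific):

```python
# Fail fast with a pointer to the fix if the private helper is gone.
try:
    from transformers.models.bloom.modeling_bloom import _expand_mask  # noqa: F401
except ImportError as err:
    raise SystemExit(
        "This transformers release no longer ships the private _expand_mask "
        "helper the demo imports; try pinning transformers==4.36.2."
    ) from err
```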
Let me try a different version now. Did you get a chance to run the demo on AMD hardware? Otherwise, I'd be happy to help.
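In case it helps others on ROCm, this is the sanity check I use to confirm the AMD GPU is visible to the PyTorch build in that image:

```python
import torch

# On ROCm wheels, torch's CUDA API is backed by HIP, so torch.cuda.*
# reports AMD GPUs; torch.version.hip is None on CUDA builds.
print("torch:", torch.__version__)
print("hip:", torch.version.hip)
if torch.cuda.is_available():
    print("device:", torch.cuda.get_device_name(0))
else:
    print("no GPU visible to PyTorch")
```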
Nope.