LLaVA
[Question] Should we use HF_HOME instead of TRANSFORMERS_CACHE?
Question
I set up LLaVA v1.2.2 with Transformers v4.36.2, and the following warning was printed:
/usr/local/lib/python3.10/dist-packages/transformers/utils/hub.py:123: FutureWarning: Using `TRANSFORMERS_CACHE` is deprecated and will be removed in v5 of Transformers. Use `HF_HOME` instead.
warnings.warn(
You are using a model of type llava to instantiate a model of type llava_llama. This is not supported for all configurations of models and can yield errors.
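Not an authoritative fix, just a minimal sketch of what the FutureWarning suggests: set `HF_HOME` (the root of the Hugging Face cache, which defaults to `~/.cache/huggingface`) instead of the deprecated `TRANSFORMERS_CACHE`, and do it before `transformers` is imported. The cache path below is only a placeholder.

```python
import os

# Point HF_HOME at the desired cache root *before* transformers is imported;
# model files are then stored under $HF_HOME/hub.
os.environ["HF_HOME"] = "/path/to/hf_cache"  # placeholder path

# Setting the old variable is what triggers the FutureWarning; it is slated
# for removal in Transformers v5, so it should no longer be set:
# os.environ["TRANSFORMERS_CACHE"] = "/path/to/hf_cache/hub"

import transformers  # imported after HF_HOME is set so the new cache path is picked up
```

If the variable is exported in a shell script or Dockerfile instead, the same change applies there: `export HF_HOME=/path/to/hf_cache` in place of `export TRANSFORMERS_CACHE=...`. Note that the second warning about `llava` vs. `llava_llama` model types is unrelated to the cache variable.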
I also ran into this warning. Have you solved it?