LLaVA
[Question] ImportError: cannot import name 'LlavaLlamaForCausalLM' from 'llava.model' (/root/LLaVA/llava/model/__init__.py)
Question
If I introduce a new package in clip_encoder.py, I get this error. What should I do? Thanks!
See this thread: https://github.com/haotian-liu/LLaVA/issues/1101 Basically, re-installing flash-attn can solve this error.
pip uninstall flash-attn
pip install flash-attn --no-build-isolation --no-cache-dir
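A minimal sketch of why a broken flash-attn install can surface as this seemingly unrelated ImportError: if a package's __init__.py swallows an ImportError raised by a broken dependency, the symbol is simply never re-exported, and importing the defining submodule directly reveals the real cause. The demo_pkg layout and totally_missing_dep name below are invented for illustration, not part of LLaVA:

```python
import os
import sys
import tempfile
import textwrap

# Build a throwaway package whose __init__.py swallows an ImportError,
# mimicking a re-export that silently fails when a dependency is broken.
tmp = tempfile.mkdtemp()
pkg_dir = os.path.join(tmp, "demo_pkg")
os.makedirs(pkg_dir)
with open(os.path.join(pkg_dir, "__init__.py"), "w") as f:
    f.write(textwrap.dedent("""\
        try:
            from .model import Model
        except ImportError:
            pass  # swallowed -- the real error is hidden from the user
    """))
with open(os.path.join(pkg_dir, "model.py"), "w") as f:
    # totally_missing_dep stands in for a broken flash-attn install.
    f.write("import totally_missing_dep\nclass Model: ...\n")

sys.path.insert(0, tmp)

# Symptom: mirrors "cannot import name 'LlavaLlamaForCausalLM' from 'llava.model'".
try:
    from demo_pkg import Model
except ImportError as e:
    print("symptom:", e)

# Diagnosis: importing the submodule directly surfaces the real cause.
try:
    import demo_pkg.model
except ImportError as e:
    print("real cause:", e)
```

Running the direct submodule import (here `import demo_pkg.model`; in LLaVA, the module that defines the missing class) is a quick way to check whether reinstalling flash-attn is actually the fix you need.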
See this thread: #1101 Basically, re-installing flash-attn can solve this error.
pip uninstall flash-attn
pip install flash-attn --no-build-isolation --no-cache-dir
Thanks!
See this thread: #1101 Basically, re-installing flash-attn can solve this error.
pip uninstall flash-attn
pip install flash-attn --no-build-isolation --no-cache-dir
The problem still persists after running these commands.
Hi hi~ I met the same issue when adding new modules. Have you successfully solved this problem?
You don't need to touch the __init__.py file; wherever you would import from __init__.py, import the corresponding file directly instead.
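The advice above can be taken literally: load the module from its file path so the package's __init__.py never runs at all. The somepkg/useful.py layout and broken_dep name below are hypothetical, standing in for llava/model and the broken dependency:

```python
import importlib.util
import os
import tempfile

# Build a package whose __init__.py is broken, plus a submodule we want.
root = tempfile.mkdtemp()
pkg_path = os.path.join(root, "somepkg")
os.makedirs(pkg_path)
with open(os.path.join(pkg_path, "__init__.py"), "w") as f:
    f.write("import broken_dep\n")  # would fail if executed
with open(os.path.join(pkg_path, "useful.py"), "w") as f:
    f.write("ANSWER = 42\n")

# Load useful.py directly from its path; __init__.py is never executed.
spec = importlib.util.spec_from_file_location(
    "useful", os.path.join(pkg_path, "useful.py")
)
mod = importlib.util.module_from_spec(spec)
spec.loader.exec_module(mod)
print(mod.ANSWER)  # the file's contents are usable despite the broken package
```

This is a workaround for experiments, not a fix: relative imports inside the loaded file will still break, so repairing the underlying dependency remains the proper solution.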
Recently I've been tied up with some other tedious work~ just saw the updates. My issue was due to a path pointing to an environment controlled by the system, which caused conflicts; after several rounds of printenv, I solved it by correcting the references.
In llava/__init__.py, I modified the code from "from .model import LlavaLlamaForCausalLM" to "from .model.language_model.llava_llama import LlavaLlamaForCausalLM", and that fixed it. I'm not sure whether it will work for anyone else, but I think I should share it.
See this thread: #1101 Basically, re-installing flash-attn can solve this error.
pip uninstall flash-attn
pip install flash-attn --no-build-isolation --no-cache-dir
This one addressed my issue (which might also have been caused by not creating a separate conda env).