
[Question] ImportError: cannot import name 'LlavaLlamaForCausalLM' from 'llava.model' (/root/LLaVA/llava/model/__init__.py)

Open 20191864218 opened this issue 5 months ago • 7 comments

Question

If I import a new package in clip_encoder.py, I get this error. What should I do? Thanks!

20191864218 avatar Mar 01 '24 04:03 20191864218

See this thread: https://github.com/haotian-liu/LLaVA/issues/1101. Basically, re-installing flash-attn can solve this error:

pip uninstall flash-attn 
pip install flash-attn --no-build-isolation --no-cache-dir
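For context on why a broken flash-attn build surfaces as this unrelated-looking ImportError: if the package's __init__.py swallows a failing import, the class name is never bound, and downstream imports then fail with the misleading "cannot import name" message. A minimal, self-contained sketch of this failure mode (the package name pkg and the swallowed import are hypothetical stand-ins for llava.model and flash-attn):

```python
import sys
import types

# Register a fake package "pkg" (a stand-in for llava.model) whose
# __init__ swallowed a failing import, so the class was never defined.
pkg = types.ModuleType("pkg")
sys.modules["pkg"] = pkg

try:
    # A broken dependency (standing in for flash-attn) fails at import
    # time, and the package's __init__ silently ignores it:
    raise ImportError("flash_attn build is broken")
except ImportError:
    pass  # LlavaLlamaForCausalLM is never bound in pkg

# Downstream imports then hit the misleading secondary error:
try:
    from pkg import LlavaLlamaForCausalLM  # noqa: F401
except ImportError as err:
    msg = str(err)
    print(msg)  # cannot import name 'LlavaLlamaForCausalLM' from 'pkg' ...
```

Re-installing flash-attn fixes the original, hidden failure, after which the import in the package's __init__.py succeeds again.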

zzxslp avatar Mar 07 '24 02:03 zzxslp

See this thread: #1101. Basically, re-installing flash-attn can solve this error:

pip uninstall flash-attn 
pip install flash-attn --no-build-isolation --no-cache-dir

Thanks!

20191864218 avatar Mar 07 '24 06:03 20191864218

See this thread: #1101. Basically, re-installing flash-attn can solve this error:

pip uninstall flash-attn 
pip install flash-attn --no-build-isolation --no-cache-dir

The problem still persists after running these commands.

hantao-zhou avatar Mar 08 '24 20:03 hantao-zhou

Hi! I ran into the same issue when adding new modules. Have you successfully solved this problem?

SuperStacie avatar Apr 09 '24 17:04 SuperStacie

Don't worry about the __init__.py file; wherever you would have imported from __init__.py, import the corresponding file directly instead.

20191864218 avatar Apr 10 '24 00:04 20191864218

I've recently been tied up with some other tedious work and just saw the updates. My issue was due to a path pointing to a system-controlled environment, which caused conflicts; after several printenv checks, I solved it by correcting the references.

hantao-zhou avatar Apr 10 '24 09:04 hantao-zhou

In llava/__init__.py, I changed the code from "from .model import LlavaLlamaForCausalLM" to "from .model.language_model.llava_llama import LlavaLlamaForCausalLM", which fixed it. I'm not sure whether it will work for anyone else, but I thought I should share it.

foreverhell avatar Apr 24 '24 06:04 foreverhell