
No module named 'mvdream.pipeline_mvdream'

Open krizalid38 opened this issue 1 year ago • 2 comments

When I try to run inference, I get an error like this:

```
(D:\condaenv\LGM) D:\ai_model>python infer.py big --resume pretrained/model_fp16.safetensors --workspace workspace_test --test_path data_test
A matching Triton is not available, some optimizations will not be enabled
Traceback (most recent call last):
  File "D:\condaenv\LGM\Lib\site-packages\xformers\__init__.py", line 55, in _is_triton_available
    from xformers.triton.softmax import softmax as triton_softmax  # noqa
  File "D:\condaenv\LGM\Lib\site-packages\xformers\triton\softmax.py", line 11, in <module>
    import triton
ModuleNotFoundError: No module named 'triton'
D:\ai_model\core\attention.py:22: UserWarning: xFormers is available (Attention)
  warnings.warn("xFormers is available (Attention)")
Traceback (most recent call last):
  File "D:\ai_model\infer.py", line 21, in <module>
    from mvdream.pipeline_mvdream import MVDreamPipeline
ModuleNotFoundError: No module named 'mvdream.pipeline_mvdream'
```

I am running on Windows 10.

Thanks.

krizalid38 avatar Feb 09 '24 07:02 krizalid38

@krizalid38 Hi, it seems the xformers installation failed. Could you try reinstalling it and check the logs?
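One quick way to narrow this down is to check which of the packages named in the traceback are actually importable in the active environment. This is a generic diagnostic sketch (not part of LGM itself); the `check_modules` helper is hypothetical:

```python
import importlib.util

def check_modules(names):
    """Return {module_name: bool} indicating whether each top-level
    module can be found in the current environment."""
    return {name: importlib.util.find_spec(name) is not None for name in names}

if __name__ == "__main__":
    # Modules that appear in the tracebacks above
    for name, ok in check_modules(["torch", "xformers", "triton", "mvdream"]).items():
        print(f"{name}: {'found' if ok else 'MISSING'}")
```

If `mvdream` reports MISSING, the second traceback is expected regardless of the xformers state, so both installs are worth checking.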

ashawkey avatar Feb 10 '24 03:02 ashawkey

Hi, if you install xformers with `pip install -U xformers --index-url https://download.pytorch.org/whl/cu118`, it will install an xformers build for torch 2.2.0+cu118, so you may need to update your torch to that version.
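For reference, the reinstall plus a version check might look like this (a sketch assuming a CUDA 11.8 setup; adjust the wheel index to your CUDA version):

```shell
# Reinstall xformers from the PyTorch cu118 wheel index; this pulls in
# the torch build it was compiled against.
pip install -U xformers --index-url https://download.pytorch.org/whl/cu118

# Confirm the two versions line up afterwards.
python -c "import torch, xformers; print(torch.__version__, xformers.__version__)"
```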

FisherYuuri avatar Mar 01 '24 02:03 FisherYuuri