
Environment setup in README does not produce a correct environment to run test_correspondence.ipynb

ycjungSubhuman opened this issue 10 months ago

Thanks for sharing the code. I struggled with the environment setup for a while, so I'm sharing my experience here and describing how I solved it.

What I did

I followed the README: 1. created the environment with conda env create -f environment.yaml, and 2. installed pytorch3d with

import sys
import torch

# Build the PyTorch3D wheel tag for the installed Python / CUDA / PyTorch
# combination, e.g. "py310_cu121_pyt210".
pyt_version_str = torch.__version__.split("+")[0].replace(".", "")
version_str = "".join([
    f"py3{sys.version_info.minor}_cu",
    torch.version.cuda.replace(".", ""),
    f"_pyt{pyt_version_str}"
])
!pip install fvcore iopath
# Install the prebuilt wheel that matches the tag above.
!pip install --no-index --no-cache-dir pytorch3d -f https://dl.fbaipublicfiles.com/pytorch3d/packaging/wheels/{version_str}/download.html

I'm using Arch Linux and miniconda3.
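For anyone reproducing this, a quick sanity check after the two steps above (just a minimal sketch; the expected version numbers depend on what environment.yaml pins):

# Confirm the interpreter, torch build, and CUDA version the wheel tag was derived from.
import sys
import torch
print(sys.version_info, torch.__version__, torch.version.cuda)

# Confirm pytorch3d actually imports from the downloaded wheel.
import pytorch3d
print(pytorch3d.__version__)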

What is expected

The procedure described above should create a conda environment in which test_correspondence.ipynb runs correctly.

What happened

  1. Some scary warnings: in the first cell (imports) of test_correspondence.ipynb,
WARNING[XFORMERS]: xFormers can't load C++/CUDA extensions. xFormers was built for:
    PyTorch 2.1.0.post100 with CUDA None (you have 2.1.0+cu121)
    Python  3.10.13 (you have 3.10.14)
  Please reinstall xformers (see https://github.com/facebookresearch/xformers#installing-xformers)
  Memory-efficient attention, SwiGLU, sparse and more won't be available.
  Set XFORMERS_MORE_DETAILS=1 for more details
  2. Runtime error when running
pipe = init_pipe(device)
dino_model = init_dino(device)

in cell 5, due to the absence of the accelerate package.

  3. Runtime error when running
f_source = compute_features(device, pipe, dino_model, source_mesh, "cow")

in cell 7, due to xformers not being installed correctly:

File ~/miniconda3/envs/diff3f/lib/python3.10/site-packages/xformers/ops/fmha/dispatch.py:63, in _run_priority_list(name, priority_list, inp)
     61 for op, not_supported in zip(priority_list, not_supported_reasons):
     62     msg += "\n" + _format_not_supported_reasons(op, not_supported)
---> 63 raise NotImplementedError(msg)

NotImplementedError: No operator found for `memory_efficient_attention_forward` with inputs:
     query       : shape=(1, 1370, 12, 64) (torch.float32)
     key         : shape=(1, 1370, 12, 64) (torch.float32)
     value       : shape=(1, 1370, 12, 64) (torch.float32)
     attn_bias   : <class 'NoneType'>
     p           : 0.0
`decoderF` is not supported because:
    xFormers wasn't build with CUDA support
    requires device with capability > (8, 0) but your GPU has capability (7, 5) (too old)
    attn_bias type is <class 'NoneType'>
    operator wasn't built - see `python -m xformers.info` for more info
`flshattF@<version>` is not supported because:
    xFormers wasn't build with CUDA support
    requires device with capability > (8, 0) but your GPU has capability (7, 5) (too old)
    dtype=torch.float32 (supported: {torch.bfloat16, torch.float16})
    operator wasn't built - see `python -m xformers.info` for more info
`tritonflashattF` is not supported because:
    xFormers wasn't build with CUDA support
    requires device with capability > (8, 0) but your GPU has capability (7, 5) (too old)
    dtype=torch.float32 (supported: {torch.bfloat16, torch.float16})
    operator wasn't built - see `python -m xformers.info` for more info
    triton is not available
    requires GPU with sm80 minimum compute capacity, e.g., A100/H100/L4
    Only work on pre-MLIR triton for now
`cutlassF` is not supported because:
    xFormers wasn't build with CUDA support
    operator wasn't built - see `python -m xformers.info` for more info
`smallkF` is not supported because:
    max(query.shape[-1] != value.shape[-1]) > 32
    xFormers wasn't build with CUDA support
    operator wasn't built - see `python -m xformers.info` for more info
    unsupported embed per head: 64
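As the traceback itself suggests, python -m xformers.info makes the mismatch easy to see. A minimal check from inside the diff3f env (nothing project-specific here):

# Show how the installed xformers wheel was built (CUDA support, which ops are available).
!python -m xformers.info

# Compare against the torch build it has to match.
import torch
import xformers
print(torch.__version__, torch.version.cuda, xformers.__version__)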

What I think is the problem

The setup does not install accelerate at all, and the xformers build it pulls in was compiled without CUDA support and against a different PyTorch build than the one in the environment, so none of its memory-efficient attention operators are available.

Suggested solution

Install accelerate, and install a CUDA-enabled xformers build that matches the PyTorch build in the environment. I configured a Docker image that builds such an environment.
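For anyone who wants to patch the conda env directly instead of using Docker, this is roughly what I mean (a sketch only; the 0.0.22.post7 pin is my assumption for the xformers release paired with torch 2.1.0+cu121, so adjust it to whatever torch build environment.yaml gives you):

# accelerate is a plain pip install inside the diff3f env.
!pip install accelerate

# Replace the CPU-only xformers build with a CUDA 12.1 wheel that matches
# torch 2.1.0+cu121. The exact pin is an assumption; pick the release that
# pairs with your torch version.
!pip install -U xformers==0.0.22.post7 --index-url https://download.pytorch.org/whl/cu121

Note that on a compute-capability 7.5 GPU the flash-attention kernels listed in the traceback still won't be used, but with a CUDA-enabled build the cutlass-based operator should be able to serve the float32 call that failed above.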

ycjungSubhuman · Apr 09 '24