
[feat] add python3.12 support

Mon-ius opened this issue 1 year ago • 6 comments

Python 3.12 unlocks more of Python's power, and it is now stable with the latest release, 3.12.2.

Besides, mainstream repos including pytorch, torchvision, huggingface_hub, transformers, accelerate, and diffusers already support it. xformers should support it as well. 🤗

Mon-ius avatar Apr 03 '24 16:04 Mon-ius

Hello, I'm facing this issue today. The build works fine on my Intel/4090 rig in py311, but in py312 it hangs for a long time and then fails with errors like this. It's hard to tell whether it's my rig or something else, but it only happens in py312, and there is a huge wall of these messages:

  × python setup.py bdist_wheel did not run successfully.
  │ exit code: 1
  ╰─> [1145 lines of output]
      fatal: not a git repository (or any of the parent directories): .git
...
      /tmp/pip-install-9_3rfhla/xformers_19eb84c6b3d04ce896977f0ca50e1d3b/third_party/flash-attention/csrc/flash_attn/flash_api.cpp: In function ‘void set_params_fprop(Flash_fwd_params&, size_t, size_t, size_t, size_t, size_t, size_t, size_t, size_t, size_t, at::Tensor, at::Tensor, at::Tensor, at::Tensor, void*, void*, void*, void*, void*, float, float, int, int, bool)’:
      /tmp/pip-install-9_3rfhla/xformers_19eb84c6b3d04ce896977f0ca50e1d3b/third_party/flash-attention/csrc/flash_attn/flash_api.cpp:49:11: warning: ‘void* memset(void*, int, size_t)’ clearing an object of non-trivial type ‘struct Flash_fwd_params’; use assignment or value-initialization instead [-Wclass-memaccess]
         49 |     memset(&params, 0, sizeof(params));
            |     ~~~~~~^~~~~~~~~~~~~~~~~~~~~~~~~~~~
      In file included from /tmp/pip-install-9_3rfhla/xformers_19eb84c6b3d04ce896977f0ca50e1d3b/third_party/flash-attention/csrc/flash_attn/flash_api.cpp:13:
      /tmp/pip-install-9_3rfhla/xformers_19eb84c6b3d04ce896977f0ca50e1d3b/third_party/flash-attention/csrc/flash_attn/src/flash.h:51:8: note: ‘struct Flash_fwd_params’ declared here
         51 | struct Flash_fwd_params : public Qkv_params {
            |        ^~~~~~~~~~~~~~~~

Any advice on building, or could we publish prebuilt wheels for py312? It's good to always keep the maximum supported Python version pegged to the most recent one; otherwise it's a crapshoot whether stuff works.

bionicles avatar Jun 06 '24 15:06 bionicles

Hi, we plan to add prebuilt wheels for py312 when PyTorch 2.4.0 is released (expected end of July).

danthe3rd avatar Jun 11 '24 14:06 danthe3rd

@danthe3rd great! Will both the conda and pip issues be solved by then?

Mon-ius avatar Jun 12 '24 00:06 Mon-ius

Yes, we plan to add 3.12 for both :)

danthe3rd avatar Jun 12 '24 07:06 danthe3rd

Please keep us posted if it winds up working ahead of schedule. I am excited about the py312 type system improvements for projects where I don't need to worry about backwards compatibility. The "type" keyword is crucial for distinguishing variables from types, and it also lets us essentially define new type aliases as one-liners (!) https://docs.python.org/3.12/whatsnew/3.12.html
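For concreteness, here is a minimal sketch of the 3.12 "type" statement (PEP 695) mentioned above; the names are purely illustrative:

    # Python 3.12: `type` creates a lazily evaluated type alias in one line
    type Vector = list[float]
    type Pair[T] = tuple[T, T]      # generic aliases use the new bracket syntax

    def scale(v: Vector, factor: float) -> Vector:
        return [x * factor for x in v]

    def swap[T](p: Pair[T]) -> Pair[T]:  # PEP 695 also adds inline type parameters
        return (p[1], p[0])

    print(scale([1.0, 2.0], 3.0))   # [3.0, 6.0]
    print(swap((1, 2)))             # (2, 1)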

Also, we're about to see py313 with potentially significant Python performance enhancements, so it could be a good idea to think about how we might need to adjust the code for Python 3.13 as well, in order to unlock those benefits soon after that version is released: https://docs.python.org/3.13/whatsnew/3.13.html

bionicles avatar Jun 12 '24 10:06 bionicles

PT 2.4.0 is actually a requirement for smaller build sizes, which we need in order to host more versions (e.g. py312), so I don't expect us to be ahead of schedule there. For Python 3.13, we will have to wait for PyTorch to support it first.

danthe3rd avatar Jun 12 '24 11:06 danthe3rd

We now have wheels for py312. Closing
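For anyone upgrading, a minimal sanity check of the new wheels (assuming a Python 3.12 environment with a CUDA GPU and installation via the usual pip install -U xformers; the expected values in the comments are illustrative):

    import sys

    import torch
    import xformers
    import xformers.ops as xops

    print(sys.version)            # expect 3.12.x
    print(torch.__version__)      # the py312 wheels pair with a matching torch release
    print(xformers.__version__)

    # memory_efficient_attention is the core op the prebuilt wheels ship kernels for;
    # inputs are (batch, seq_len, heads, head_dim)
    q = k = v = torch.randn(1, 128, 8, 64, device="cuda", dtype=torch.float16)
    out = xops.memory_efficient_attention(q, k, v)
    print(out.shape)              # torch.Size([1, 128, 8, 64])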

danthe3rd avatar Aug 26 '24 15:08 danthe3rd

@danthe3rd cheers, how about conda 🤗

Mon-ius avatar Aug 27 '24 11:08 Mon-ius