
Installing Ops for use with PyInstaller

Eichhof opened this issue on Apr 10, 2023

Describe the bug
Currently, the ops are compiled just-in-time (JIT), which causes problems with PyInstaller. How can I prebuild them? Where does DeepSpeed put those extensions, and how does it find them? Is it possible to set the path to the extensions manually?
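(For reference: DeepSpeed builds these ops at runtime through torch.utils.cpp_extension, which caches the compiled shared objects under ~/.cache/torch_extensions by default and honors the TORCH_EXTENSIONS_DIR environment variable. A minimal sketch, with an illustrative override path:)

    # Show where JIT-compiled DeepSpeed ops are cached by default
    ls ~/.cache/torch_extensions/

    # Redirect the JIT build/cache directory (read by torch.utils.cpp_extension,
    # which DeepSpeed's op builder uses); the path below is illustrative
    export TORCH_EXTENSIONS_DIR=/opt/myapp/torch_extensions
    python -c "import deepspeed; print(deepspeed.__version__)"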

Expected behavior
Prebuild the ops so that I can include them in the PyInstaller build.
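(If the ops are pre-built into the installed deepspeed package, as described in the reply below, the resulting .so files can be bundled with PyInstaller's binary-collection options. A hedged sketch assuming a recent PyInstaller; my_app.py and the library path are placeholders:)

    # Collect all compiled binaries shipped inside the installed deepspeed package
    pyinstaller --collect-binaries deepspeed my_app.py

    # Or add one prebuilt op library explicitly (source path and destination are illustrative)
    pyinstaller --add-binary "/path/to/fused_adam.so:deepspeed/ops" my_app.py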

ds_report output

DeepSpeed C++/CUDA extension op report

NOTE: Ops not installed will be just-in-time (JIT) compiled at runtime if needed. Op compatibility means that your system meet the required dependencies to JIT install the op.

JIT compiled ops requires ninja
ninja .................. [OKAY]

op name ................ installed .. compatible

[WARNING] async_io requires the dev libaio .so object and headers but these were not found.
[WARNING] async_io: please install the libaio-dev package with apt
[WARNING] If libaio is already installed (perhaps from source), try setting the CFLAGS and LDFLAGS environment variables to where it can be found.
async_io ............... [NO] ....... [NO]
cpu_adagrad ............ [NO] ....... [OKAY]
cpu_adam ............... [NO] ....... [OKAY]
fused_adam ............. [NO] ....... [OKAY]
fused_lamb ............. [NO] ....... [OKAY]
quantizer .............. [NO] ....... [OKAY]
random_ltd ............. [NO] ....... [OKAY]
[WARNING] sparse_attn requires a torch version >= 1.5 but detected 2.0
[WARNING] using untested triton version (2.0.0), only 1.0.0 is known to be compatible
sparse_attn ............ [NO] ....... [NO]
spatial_inference ...... [NO] ....... [OKAY]
transformer ............ [NO] ....... [OKAY]
stochastic_transformer . [NO] ....... [OKAY]
transformer_inference .. [NO] ....... [OKAY]
utils .................. [NO] ....... [OKAY]

DeepSpeed general environment info:
torch install path ............... ['/home/myUsername/anaconda3/envs/huggingface/lib/python3.11/site-packages/torch']
torch version .................... 2.0.0+cu117
deepspeed install path ........... ['/home/myUsername/anaconda3/envs/huggingface/lib/python3.11/site-packages/deepspeed']
deepspeed info ................... 0.8.3, unknown, unknown
torch cuda version ............... 11.7
torch hip version ................ None
nvcc version ..................... 11.7
deepspeed wheel compiled w. ...... torch 2.0, cuda 11.8

System info (please complete the following information):

  • OS: Ubuntu 22.04 LTS
  • GPU count and types: 1 machine with RTX 3090
  • Hugging Face Transformers/Accelerate/etc. versions: Transformers 4.27.2 and Accelerate 0.17.1
  • Python version: 3.11

Docker context
No Docker used

Eichhof · Apr 10 '23 09:04

@Eichhof - you can pre-build the ops by following the directions here. Set DS_BUILD_OPS=1 when you install DeepSpeed (e.g. DS_BUILD_OPS=1 pip install deepspeed, or DS_BUILD_OPS=1 pip install . from a source checkout); this works regardless of whether you install via pip or from source.
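(For reference, the pre-build flow described above uses DeepSpeed's DS_BUILD_* environment variables; a short sketch, with the exact per-op flags depending on your DeepSpeed version:)

    # Build all compatible ops ahead of time instead of JIT-compiling them at runtime
    DS_BUILD_OPS=1 pip install deepspeed

    # Or build only selected ops, e.g. the fused Adam optimizer
    DS_BUILD_FUSED_ADAM=1 pip install deepspeed

    # From a source checkout
    DS_BUILD_OPS=1 pip install .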

loadams · Apr 10 '23 21:04

@Eichhof - if you run into problems with this, feel free to re-open this issue or open a new one describing the specific error you're seeing, but installing the ops should be straightforward with the steps above.

loadams · Apr 18 '23 23:04