Install ipex main branch conda env will meet an error
Describe the issue
I followed this README to set up the conda environment and tried both building the Docker image and installing directly via conda, but I always hit an error. These are my steps:
git clone https://github.com/intel/intel-extension-for-pytorch.git
conda create -n llm python=3.10 -y
conda activate llm
cd ./intel-extension-for-pytorch/examples/cpu/inference/python/llm
export PATH=/root/anaconda3/bin/:$PATH
bash ./tools/env_setup.sh
I think it may be caused by a version conflict. This is the error output:
status code: 0
reformatted /root/wangjian/project/test/intel-extension-for-pytorch/intel_extension_for_pytorch/frontend.py
reformatted /root/wangjian/project/test/intel-extension-for-pytorch/intel_extension_for_pytorch/transformers/optimize.py
reformatted /root/wangjian/project/test/intel-extension-for-pytorch/intel_extension_for_pytorch/transformers/models/reference/modules/attentions.py
All done! ✨ 🍰 ✨
3 files reformatted, 125 files left unchanged.
status code: 1
All done! ✨ 🍰 ✨
3 files left unchanged.
status code: 0
reformatted /root/wangjian/project/test/intel-extension-for-pytorch/tests/cpu/test_paged_attention.py
All done! ✨ 🍰 ✨
1 file reformatted, 90 files left unchanged.
status code: 1
ERROR: flake8 found format errors!
/root/wangjian/project/test/intel-extension-for-pytorch/setup.py:236: DeprecationWarning: distutils Version classes are deprecated. Use packaging.version instead.
return LooseVersion(line.strip().split(" ")[2])
/root/wangjian/project/test/intel-extension-for-pytorch/setup.py:242: DeprecationWarning: distutils Version classes are deprecated. Use packaging.version instead.
_cmake_min_version = LooseVersion("3.13.0")
-- The C compiler identification is GNU 12.3.0
-- The CXX compiler identification is GNU 12.3.0
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Check for working C compiler: /root/anaconda3/envs/wangjian-test/bin/gcc - skipped
-- Detecting C compile features
-- Detecting C compile features - done
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Check for working CXX compiler: /root/anaconda3/envs/wangjian-test/bin/g++ - skipped
-- Detecting CXX compile features
-- Detecting CXX compile features - done
CMake Error at CMakeLists.txt:34 (find_package):
Could not find a configuration file for package "Torch" that is compatible
with requested version "2.3".
The following configuration files were considered but not accepted:
/root/anaconda3/envs/wangjian-test/lib/python3.10/site-packages/torch/share/cmake/Torch/TorchConfig.cmake, version: 2.2.0
-- Configuring incomplete, errors occurred!
Traceback (most recent call last):
File "/root/wangjian/project/test/intel-extension-for-pytorch/setup.py", line 1197, in <module>
setup(
File "/root/anaconda3/envs/wangjian-test/lib/python3.10/site-packages/setuptools/__init__.py", line 103, in setup
return distutils.core.setup(**attrs)
File "/root/anaconda3/envs/wangjian-test/lib/python3.10/site-packages/setuptools/_distutils/core.py", line 185, in setup
return run_commands(dist)
File "/root/anaconda3/envs/wangjian-test/lib/python3.10/site-packages/setuptools/_distutils/core.py", line 201, in run_commands
dist.run_commands()
File "/root/anaconda3/envs/wangjian-test/lib/python3.10/site-packages/setuptools/_distutils/dist.py", line 969, in run_commands
self.run_command(cmd)
File "/root/anaconda3/envs/wangjian-test/lib/python3.10/site-packages/setuptools/dist.py", line 989, in run_command
super().run_command(command)
File "/root/anaconda3/envs/wangjian-test/lib/python3.10/site-packages/setuptools/_distutils/dist.py", line 988, in run_command
cmd_obj.run()
File "/root/anaconda3/envs/wangjian-test/lib/python3.10/site-packages/wheel/bdist_wheel.py", line 364, in run
self.run_command("build")
File "/root/anaconda3/envs/wangjian-test/lib/python3.10/site-packages/setuptools/_distutils/cmd.py", line 318, in run_command
self.distribution.run_command(command)
File "/root/anaconda3/envs/wangjian-test/lib/python3.10/site-packages/setuptools/dist.py", line 989, in run_command
super().run_command(command)
File "/root/anaconda3/envs/wangjian-test/lib/python3.10/site-packages/setuptools/_distutils/dist.py", line 988, in run_command
cmd_obj.run()
File "/root/anaconda3/envs/wangjian-test/lib/python3.10/site-packages/setuptools/_distutils/command/build.py", line 131, in run
self.run_command(cmd_name)
File "/root/anaconda3/envs/wangjian-test/lib/python3.10/site-packages/setuptools/_distutils/cmd.py", line 318, in run_command
self.distribution.run_command(command)
File "/root/anaconda3/envs/wangjian-test/lib/python3.10/site-packages/setuptools/dist.py", line 989, in run_command
super().run_command(command)
File "/root/anaconda3/envs/wangjian-test/lib/python3.10/site-packages/setuptools/_distutils/dist.py", line 988, in run_command
cmd_obj.run()
File "/root/wangjian/project/test/intel-extension-for-pytorch/setup.py", line 1166, in run
self.run_command("build_clib")
File "/root/anaconda3/envs/wangjian-test/lib/python3.10/site-packages/setuptools/_distutils/cmd.py", line 318, in run_command
self.distribution.run_command(command)
File "/root/anaconda3/envs/wangjian-test/lib/python3.10/site-packages/setuptools/dist.py", line 989, in run_command
super().run_command(command)
File "/root/anaconda3/envs/wangjian-test/lib/python3.10/site-packages/setuptools/_distutils/dist.py", line 988, in run_command
cmd_obj.run()
File "/root/wangjian/project/test/intel-extension-for-pytorch/setup.py", line 785, in run
_gen_build_cfg_from_cmake(
File "/root/wangjian/project/test/intel-extension-for-pytorch/setup.py", line 617, in _gen_build_cfg_from_cmake
check_call(
File "/root/anaconda3/envs/wangjian-test/lib/python3.10/subprocess.py", line 369, in check_call
raise CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command '['cmake', '/root/wangjian/project/test/intel-extension-for-pytorch', '-DBUILD_MODULE_TYPE=CPU', '-DBUILD_WITH_XPU=OFF', '-DCMAKE_BUILD_TYPE=Release', '-DCMAKE_INSTALL_LIBDIR=lib', '-DCMAKE_INSTALL_PREFIX=/root/wangjian/project/test/intel-extension-for-pytorch/build/Release/packages/intel_extension_for_pytorch', '-DCMAKE_PREFIX_PATH=/root/anaconda3/envs/wangjian-test/lib/python3.10/site-packages/torch/share/cmake;/root/anaconda3/envs/wangjian-ipex;/root/anaconda3/envs/wangjian-ipex/x86_64-conda-linux-gnu/sysroot/usr', '-DCMAKE_PROJECT_VERSION=2.3.0', '-DIPEX_PROJ_NAME=intel_extension_for_pytorch', '-DLIBIPEX_GITREV=5ed3a2413', '-DLIBIPEX_VERSION=2.3.0+git5ed3a24', '-DPYTHON_EXECUTABLE=/root/anaconda3/envs/wangjian-test/bin/python', '-DPYTHON_INCLUDE_DIR=/root/anaconda3/envs/wangjian-test/include/python3.10', '-DPYTHON_PLATFORM_INFO=Linux-5.19.17-tdx.v3.6.mvp23.el8.x86_64-x86_64-with-glibc2.28']' returned non-zero exit status 1.
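From the CMake error it looks like the build requests Torch 2.3 while this environment only provides 2.2.0. As a quick sanity check (just a sketch; it assumes the requested version sits in the top-level CMakeLists.txt, as the error message suggests):
# show the torch that CMake picks up from the active environment
python -c "import torch; print(torch.__version__)"
# show which Torch version the build requests
grep -n "find_package(Torch" CMakeLists.txt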
I tried branch v2.1.0 and it can run the llama model successfully. I want to use the main branch to run the Baichuan and ChatGLM models. Please help me, thank you!
What pytorch version are you using? Have you tried pytorch mainline?
I ran this script to install the related libs:
bash ./tools/env_setup.sh
The PyTorch version is controlled by compile_bundle.sh, and on the main branch it installs version 2.2.0.dev20231213+cpu, while the build reports "Building Intel Extension for PyTorch. Version: 2.3.0+git5ed3a24" and then hits the error above.
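If it helps, the pinned nightly should be visible by grepping the setup script (assuming the version string is written literally there):
grep -n "torch" ./tools/env_setup.sh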
Thanks. I guess that's a problem we should fix on the mainline. cc @jingxu10
On the other hand, we usually recommend that users use a more stable release version of IPEX instead of the mainline directly. May I ask why you want to use the mainline?
Because I want to run the Baichuan, ChatGLM, and Mistral models, which are supported in the main branch.
Perhaps you can try hacking the following line to work around the problem. I haven't tried it locally though.
https://github.com/intel/intel-extension-for-pytorch/blob/5ed3a2413db5f0a5e53bcca0b3e84a814d87bb50/CMakeLists.txt#L33
Change it to:
set(Torch_COMP_VERION "2.2")
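If it helps, the same edit could be applied with a one-liner like this (untested; it assumes line 33 currently reads set(Torch_COMP_VERION "2.3")):
sed -i 's/set(Torch_COMP_VERION "2.3")/set(Torch_COMP_VERION "2.2")/' CMakeLists.txt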
I tried it and no longer hit that error. But a new error appears after setting up the env: the PyTorch version does not match the IPEX version:
======================================================
Note: Set environment variable "export LD_PRELOAD=/root/anaconda3/envs/wangjian-test/lib/libstdc++.so.6.0.32" to avoid the "version `GLIBCXX_N.N.NN' not found" error.
======================================================
torch_cxx11_abi: False
torch_version: 2.2.0.dev20231213+cpu
ERROR! Intel® Extension for PyTorch* needs to work with PyTorch/libtorch 2.3.*, but PyTorch/libtorch 2.2.0 is found. Please switch to the matching version and run again.
The installed torch versions:
pip list | grep torch
intel-extension-for-pytorch 2.3.0+git5ed3a24
torch 2.2.0.dev20231213+cpu
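I guess the local workaround would be to install a matching torch 2.3 nightly instead; something like this should pull a CPU build (not tried yet, and the exact nightly index/date may differ):
pip install --pre torch --index-url https://download.pytorch.org/whl/nightly/cpu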
Hi, this was a version mismatch issue and it is now fixed. Please try again with the latest master, thanks.