
Installed intel-extension-for-transformers and I get an error - No module named 'intel_extension_for_pytorch'

Open sungkim11 opened this issue 1 year ago • 6 comments

I am using an Arc A770 GPU on Windows 11.

  1. I have installed WSL2.
  2. I have installed miniconda.
  3. I followed the instructions: `pip install intel-extension-for-transformers`.
  4. I ran the example GPU code and got an error: "Exception has occurred: ModuleNotFoundError: No module named 'intel_extension_for_pytorch'"
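A quick way to confirm which dependency is actually missing before reinstalling anything (a minimal sketch; the module names are the ones reported above):

```python
import importlib.util

# find_spec returns None for a top-level module that is not installed,
# without actually importing it (so no side effects).
for mod in ("intel_extension_for_pytorch", "intel_extension_for_transformers"):
    spec = importlib.util.find_spec(mod)
    print(f"{mod}: {'found' if spec else 'MISSING'}")
```

If `intel_extension_for_pytorch` shows as MISSING, the GPU example cannot run until it is installed separately.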

sungkim11 avatar Feb 01 '24 06:02 sungkim11

Hi @sungkim11, you could refer to the GPU instructions to enable WOQ (weight-only quantization) on Intel GPU.

airMeng avatar Feb 01 '24 07:02 airMeng

Why is this asking for username/password:

```shell
python -m pip install torch==2.1.2 -f https://developer.intel.com/ipex-whl-stable-xpu

source /opt/intel/oneapi/setvars.sh

git clone https://github.com/intel-innersource/frameworks.ai.pytorch.ipex-gpu.git ipex-gpu
cd ipex-gpu
git checkout -b dev/QLLM origin/dev/QLLM
git submodule update --init --recursive

pip install -r requirements.txt
python setup.py install
```

sungkim11 avatar Feb 01 '24 07:02 sungkim11

I cannot get past this step. Any help?

sungkim11 avatar Feb 01 '24 19:02 sungkim11

> Why is this asking for username/password:
>
> ```shell
> python -m pip install torch==2.1.2 -f https://developer.intel.com/ipex-whl-stable-xpu
>
> source /opt/intel/oneapi/setvars.sh
>
> git clone https://github.com/intel-innersource/frameworks.ai.pytorch.ipex-gpu.git ipex-gpu
> cd ipex-gpu
> git checkout -b dev/QLLM origin/dev/QLLM
> git submodule update --init --recursive
>
> pip install -r requirements.txt
> python setup.py install
> ```

Sorry, @sungkim11, that is our fault. The clone URL should be https://github.com/intel/intel-extension-for-pytorch.git; this has been updated in https://github.com/intel/intel-extension-for-transformers/pull/1240. Please try again.
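With the corrected public URL, the build sequence would look like this (a sketch based on the commands quoted above; the `dev/QLLM` branch name is taken from the earlier comment, so verify it against the current instructions):

```shell
# Install the matching PyTorch wheel from Intel's XPU channel
python -m pip install torch==2.1.2 -f https://developer.intel.com/ipex-whl-stable-xpu

# Load the oneAPI compiler/runtime environment
source /opt/intel/oneapi/setvars.sh

# Clone the public repository (corrected URL, no credentials required)
git clone https://github.com/intel/intel-extension-for-pytorch.git ipex-gpu
cd ipex-gpu
git checkout -b dev/QLLM origin/dev/QLLM
git submodule update --init --recursive

# Build and install from source
pip install -r requirements.txt
python setup.py install
```

The earlier URL pointed at `intel-innersource`, an internal org, which is why git prompted for a username and password.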

airMeng avatar Feb 02 '24 00:02 airMeng

Error:

```
UserWarning: Attempted to use ninja as the BuildExtension backend but we could not find ninja.. Falling back to using the slow distutils backend.
  warnings.warn(msg.format('we could not find ninja.'))
running build_clib
WARNING: Please install flake8 by pip install -r requirements-flake8.txt to check format!
```

sungkim11 avatar Feb 02 '24 02:02 sungkim11

> Error:
>
> ```
> UserWarning: Attempted to use ninja as the BuildExtension backend but we could not find ninja.. Falling back to using the slow distutils backend.
>   warnings.warn(msg.format('we could not find ninja.'))
> running build_clib
> WARNING: Please install flake8 by pip install -r requirements-flake8.txt to check format!
> ```

These are just warnings, not errors. Can you paste the actual errors here, or the full log?

airMeng avatar Feb 02 '24 07:02 airMeng