First Python import of torch & ipex takes a long time on Ultra 7 155H
Hi,
I'm running ipex-llm with the "ipex-2.1.10+xpu" wheel and Python 3.10 on an Ultra 7 155H laptop with Windows 11.
Issue 1.
Every time I set up ipex-llm on a laptop that has been restored to factory settings (to make sure the OS is clean) and run the inference Python sample, the first import of the torch & ipex libraries takes almost 60 seconds. This does not happen on subsequent runs: the import then takes about 3 seconds.
Here is my reproduction code,
and the console output:
Issue 2.
Also, if I copy the conda env directly to another laptop without going through the pip installation process (for fast deployment), the first import takes 120 seconds.
Any idea why the first import is so slow, and how can I reduce the time?
PS: Here is the IPEX-LLM setup process:
conda create -n ipex-llm-online python=3.10 libuv
conda activate ipex-llm-online
pip config set global.index-url https://pypi.tuna.tsinghua.edu.cn/simple
pip config list
pip install dpcpp-cpp-rt==2024.0.2 mkl-dpcpp==2024.0.0 onednn==2024.0.0
pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/cn/
and the reproduction code is:
print('************* start import time:')
import time
print('************* start import torch: ' + time.asctime(time.localtime(time.time())))
import torch
print('************* start import argparse: ' + time.asctime(time.localtime(time.time())))
import argparse
print('************* start import AutoModelForCausalLM: ' + time.asctime(time.localtime(time.time())))
from ipex_llm.transformers import AutoModelForCausalLM
print('************* import AutoModelForCausalLM ready: ' + time.asctime(time.localtime(time.time())))
from transformers import AutoTokenizer
print('************* import AutoTokenizer ready: ' + time.asctime(time.localtime(time.time())))
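As a side note, the same breakdown can be obtained without modifying the script: CPython's built-in `-X importtime` flag logs the self and cumulative import time of every module to stderr. A minimal sketch of how it can be driven from Python (shown with a stdlib module as a stand-in; substitute `torch` in the real env):

```python
# Break down import cost per module with CPython's -X importtime flag.
# Each stderr line has the form: "import time: self [us] | cumulative | name".
import subprocess
import sys

result = subprocess.run(
    [sys.executable, "-X", "importtime", "-c", "import json"],
    capture_output=True, text=True, check=True,
)
# The last line reports the top-level module's cumulative import time.
print(result.stderr.strip().splitlines()[-1])
```

Running this against `import torch` instead would show which submodule dominates the 60 seconds.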
Hi @jianjungu,
We are currently reproducing and investigating this issue, and will let you know when we have results. :)
Hi @jianjungu,
On our Core Ultra 5 125H, we have not been able to reproduce the issue in a newly created conda env. The results are:
- First time of import
- Second time of import
Test env
conda create -n test-import-py310 python=3.10 libuv
pip install dpcpp-cpp-rt==2024.0.2 mkl-dpcpp==2024.0.0 onednn==2024.0.0
pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/cn/
Test code
import time
st = time.time()
import torch
print(f"torch import: {time.time() - st} s")
st = time.time()
import intel_extension_for_pytorch as ipex
print(f"ipex import: {time.time() - st} s")
st = time.time()
from ipex_llm.transformers import AutoModelForCausalLM
print(f"ipex_llm.transformers.AutoModelForCausalLM import: {time.time() - st} s")
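Note that in-process timing like the above only observes the first import inside that one interpreter. To compare the "first run after setup" state with the "second run" state, each measurement can be taken in a fresh Python process, so that only the OS file cache and the compiled .pyc files carry over between runs. A rough sketch (shown with a stdlib module; substitute the real imports):

```python
# Time `import <module>` in a fresh interpreter per run, so repeated calls
# mimic a "second run" where only OS cache and .pyc files are warm.
import subprocess
import sys
import time

def timed_import(module: str) -> float:
    """Return the wall time of importing `module` in a new Python process."""
    start = time.perf_counter()
    subprocess.run([sys.executable, "-c", f"import {module}"], check=True)
    return time.perf_counter() - start

print(f"run 1: {timed_import('json'):.3f} s")
print(f"run 2: {timed_import('json'):.3f} s")
```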
However, this may be because we are not using a clean machine with factory settings. It would be helpful if we could conduct some further tests on your clean machine :)
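One factor worth ruling out, especially for the copied-env case in Issue 2, is bytecode compilation: pip normally byte-compiles packages at install time, but if copying the env changed the .py file timestamps, the existing .pyc files are invalidated and everything is recompiled on first import (and antivirus scanning of freshly copied files can add further delay). As a hedged mitigation sketch, not something we have verified on your setup, the env's site-packages could be pre-compiled once after copying:

```python
# Pre-compile all .py files under the active env's site-packages, so the
# first real import does not pay the bytecode-compilation cost.
import compileall
import sysconfig

site_packages = sysconfig.get_paths()["purelib"]  # site-packages of this env
# quiet=1 keeps output short; workers=0 spawns one worker per CPU core.
compileall.compile_dir(site_packages, quiet=1, workers=0)
```

If the first import is fast after running this once, recompilation (rather than the imports themselves) is the likely cause of the 120-second delay.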