Wallace Lee
I'm getting the following error when trying to run the smartlab demo. The latest 2022.1 OpenVINO runtime is installed. Any ideas how I can fix this? _Traceback (most recent call...
Hi, I encountered the following error message when trying to enable flash attention with the command below. Can I know whether flash attention is supported? ``command: ./main -m $model -n...
Hi, how do I get Llama-3.2 to work with ipex_llm? Here's my code. ``` import requests import torch from PIL import Image from transformers import MllamaForConditionalGeneration, AutoProcessor from...
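For reference, a minimal sketch of one common loading pattern for this kind of setup, assuming ipex_llm's generic `optimize_model` helper and an XPU device are available; the checkpoint name, image path, and prompt below are illustrative, not taken from the original post:

```
import torch
from PIL import Image
from transformers import MllamaForConditionalGeneration, AutoProcessor
from ipex_llm import optimize_model  # assumption: ipex_llm's generic optimize_model helper

# Illustrative checkpoint name; substitute the one from your own script.
model_id = "meta-llama/Llama-3.2-11B-Vision-Instruct"

model = MllamaForConditionalGeneration.from_pretrained(
    model_id, torch_dtype=torch.float16, low_cpu_mem_usage=True
)
model = optimize_model(model)   # apply ipex_llm low-bit optimizations to the loaded model
model = model.to("xpu")         # move the optimized model to the Intel GPU

processor = AutoProcessor.from_pretrained(model_id)

# Hypothetical local image and prompt, for illustration only.
image = Image.open("example.jpg")
messages = [{"role": "user",
             "content": [{"type": "image"},
                         {"type": "text", "text": "Describe this image."}]}]
prompt = processor.apply_chat_template(messages, add_generation_prompt=True)

inputs = processor(image, prompt, return_tensors="pt").to("xpu")
with torch.inference_mode():
    output = model.generate(**inputs, max_new_tokens=64)
print(processor.decode(output[0], skip_special_tokens=True))
```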
### Describe the issue Hi, I encountered the following error after installing IPEX v2.3.110 following the guide from this [link](https://intel.github.io/intel-extension-for-pytorch/index.html#installation?platform=gpu&version=v2.3.110%2bxpu&os=windows&package=pip) system: MTL-H OS: Windows 11 Python ver: 3.11 ` python -m...
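For context, a minimal sanity check of an xpu install, along the lines of the environment check in Intel's installation guide (a sketch; exact output and attribute names may differ between IPEX releases):

```
import torch
import intel_extension_for_pytorch as ipex  # must be imported after torch

print(torch.__version__)
print(ipex.__version__)

# List the XPU devices the runtime can see; an empty list usually points
# at a driver / oneAPI environment problem rather than the pip install itself.
for i in range(torch.xpu.device_count()):
    print(f"[{i}]: {torch.xpu.get_device_properties(i)}")
```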