Ye Ting

33 comments by Ye Ting

@0Pinky0 Could you please verify whether the IPEX v2.3.110 hotfix patch resolves your issue? Please refer to the installation guide: https://intel.github.io/intel-extension-for-pytorch/index.html#installation?platform=gpu&version=v2.3.110%2bxpu&os=windows&package=pip. # For Intel® Arc™ A-Series Graphics, use the commands below:...

IPEX 2.3.110 Windows wheels for the Arc A770 are publicly available; please refer to https://intel.github.io/intel-extension-for-pytorch/index.html#installation?platform=gpu&version=v2.3.110%2bxpu&os=windows&package=pip

@CleopatraGreuel We recommend using the latest PyTorch 2.8 for Meteor Lake support. Please refer to https://www.intel.com/content/www/us/en/developer/articles/tool/pytorch-prerequisites-for-intel-gpu/2-8.html. Note that we have deprioritized active development of Intel® Extension for PyTorch* on GPU...

@Nehereus Since you are able to reproduce this issue only with PyTorch 2.9.1, it is not related to IPEX. Could you please open an issue in https://github.com/intel/torch-xpu-ops/ for awareness?

Which version of transformers is used here?

It seems some dependent oneAPI libraries were not found. How did you install IPEX v2.8.10+xpu? Did you build from source, or install the prebuilt binaries following https://pytorch-extension.intel.com/installation?platform=gpu&version=v2.8.10%2Bxpu&os=windows&package=pip?

Please refer to the installation guide for details: https://pytorch-extension.intel.com/installation?platform=gpu&version=v2.8.10%2Bxpu&os=linux%2Fwsl2&package=pip. When using the prebuilt wheels, please don't source the oneAPI environment variables; the dependent libraries will be installed automatically when you...
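As a quick way to check for this, here is a small diagnostic sketch. It assumes the `SETVARS_COMPLETED` and `ONEAPI_ROOT` variables exported by oneAPI's setvars scripts; the helper name is hypothetical:

```python
import os

# When using prebuilt IPEX wheels, the oneAPI environment should NOT be
# active, because the wheels bundle their own copies of the runtime
# libraries; a sourced setvars.sh/oneapi-vars.sh can cause conflicts.
def oneapi_env_sourced(environ=os.environ) -> bool:
    """Heuristic: setvars scripts export SETVARS_COMPLETED=1 and ONEAPI_ROOT."""
    return environ.get("SETVARS_COMPLETED") == "1" or "ONEAPI_ROOT" in environ

if oneapi_env_sourced():
    print("Warning: oneAPI environment is active; it may conflict with prebuilt wheels.")
else:
    print("OK: no oneAPI environment detected.")
```

Run this in the same shell you use to launch your workload; if the warning prints, open a fresh shell without sourcing the oneAPI scripts.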

The A770 generally supports BF16. Typically, we recommend BF16 for training and FP16 for inference workloads. If you encounter any accuracy issues with specific operators on the A770, we suggest testing...
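The range-vs-precision trade-off behind that recommendation can be illustrated with stdlib-only round-trips. This is an illustration, not an IPEX API; bfloat16 is emulated here by truncating float32 bits:

```python
import struct

def to_fp16(x: float) -> float:
    """Round-trip x through IEEE 754 binary16 (FP16): 5 exponent bits, 10 mantissa bits."""
    return struct.unpack("e", struct.pack("e", x))[0]

def to_bf16(x: float) -> float:
    """Emulate bfloat16 by keeping the top 16 bits of a float32: 8 exponent bits, 7 mantissa bits."""
    (bits,) = struct.unpack("I", struct.pack("f", x))
    return struct.unpack("f", struct.pack("I", bits & 0xFFFF0000))[0]

# FP16 is more precise near 1.0 (more mantissa bits) ...
print(to_fp16(3.14159), to_bf16(3.14159))

# ... but BF16 keeps float32's exponent range, so large training-time
# values (gradients, loss scales) don't overflow the way FP16 does.
try:
    to_fp16(1e10)
except OverflowError:
    print("FP16 overflows at 1e10")
print(to_bf16(1e10))  # still finite
```

This is why BF16 is the safer default for training (wide dynamic range), while FP16's extra mantissa bits favor inference where values are already well-scaled.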

@danielmayost I don't have the answer. Please ask directly at https://github.com/intel/ipex-llm.

> Can a new version of [portable Ollama](https://github.com/intel/ipex-llm/blob/main/docs/mddocs/Quickstart/ollama_portable_zip_quickstart.md) be provided using either latest ipex or Pytorch 2.8 release? Instructions for using the XPU device with Ollama will be helpful. I...