
Error with LLVM Configuration on Windows for GPU Inference in mlc-llm

Open sreejith-ios opened this issue 1 year ago • 8 comments

I am encountering issues while trying to access the GPU for LLM inferencing with mlc-llm on Windows.

To Reproduce

  1. Followed the official documentation for mlc-llm.
  2. Set up my environment and installed LLVM version 19.1.0.
  3. Tried the CLI mode and Python API to run on Windows.
  4. Successfully performed inference using mlc-llm and Python script on the CPU.
  5. Faced issues while trying to access the GPU for LLM inferencing.

Error Messages

Error: Using LLVM 19.1.1 with -mcpu=apple-latest is not valid in -mtriple=arm64-apple-macos, using default -mcpu=generic.

Expected behavior

I expected to access the GPU for LLM inferencing without encountering configuration-related errors.

Environment

  • Platform: Vulkan
  • Operating system: Windows
  • Device: Intel Arc GPU
  • How you installed MLC-LLM: conda
  • How you installed TVM-Unity: pip
  • Python version: 3.9
  • GPU driver version: Intel Arc 31.0.101.5449
  • CUDA/cuDNN version: Not applicable
  • TVM Unity Hash Tag: dc87019cb805d0a1f0075f6415cc979ef337ec2a
  • LLVM version: 19.1.1

Additional context

I have verified that llvm-config.exe is accessible and the version shows correctly. Despite setting the target platform explicitly to Vulkan and ensuring all packages and dependencies are updated, I still encounter this issue when trying to access the GPU.
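For reference, the "llvm-config is accessible and shows the right version" check described above can be scripted with the Python standard library alone. This is a generic environment probe, not part of the mlc-llm API; the function name is made up for illustration.

```python
import shutil
import subprocess

def llvm_config_version():
    """Return the version string reported by llvm-config, or None if it
    is not on PATH (shutil.which also resolves llvm-config.exe on Windows)."""
    exe = shutil.which("llvm-config")
    if exe is None:
        return None
    result = subprocess.run([exe, "--version"], capture_output=True, text=True)
    return result.stdout.strip() or None

print(llvm_config_version())
```

If this prints None while `llvm-config.exe` works in your terminal, the Python process is likely seeing a different PATH than your shell.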


sreejith-ios avatar Oct 09 '24 09:10 sreejith-ios

Hello, the Linux documentation at https://llm.mlc.ai/docs/install/mlc_llm.html#install-mlc-packages recommends using Python 3.11. Does that help?

BlindDeveloper avatar Oct 10 '24 05:10 BlindDeveloper

Hi @BlindDeveloper, I am using an Intel Arc GPU on a Windows 10 machine.

sreejith-ios avatar Oct 10 '24 05:10 sreejith-ios

Hi @BlindDeveloper, I have been following the official mlc-llm documentation.

sreejith-ios avatar Oct 10 '24 05:10 sreejith-ios

@sreejith-ios If you launch mlc-llm on your Windows computer using Python 3.11, is the bug still present?

BlindDeveloper avatar Oct 10 '24 06:10 BlindDeveloper

Error: Using LLVM 19.1.1 with -mcpu=apple-latest is not valid in -mtriple=arm64-apple-macos, using default -mcpu=generic.

I wonder why you are using apple and macos as mcpu and mtriple.

What's your command to compile or run the model?

Hzfengsy avatar Oct 10 '24 12:10 Hzfengsy

I am using the mlc_llm chat MODEL [--model-lib PATH-TO-MODEL-LIB] command from the official mlc-llm documentation (https://llm.mlc.ai/docs/deploy/cli.html#id2) for LLM inference, after converting the HF-downloaded model to MLC format following https://llm.mlc.ai/docs/compilation/convert_weights.html

I get the above error on my CLI, which is what I copied into the ticket.
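Since the error mentions an Apple target on a Windows/Vulkan setup, one thing worth trying is pinning the device explicitly rather than relying on auto-detection. A hedged sketch (MODEL is the same placeholder as in the command above; I am assuming the CLI accepts a --device flag, as the deploy docs describe):

```shell
# Force the Vulkan backend instead of letting mlc_llm auto-detect a device.
# MODEL and the optional --model-lib path are placeholders, as above.
mlc_llm chat MODEL --device vulkan
```

If the Apple -mcpu/-mtriple warning persists even with the device pinned, it points at the compiled model library or the TVM build rather than device selection.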

sreejith-ios avatar Oct 10 '24 13:10 sreejith-ios

Starting from LLVM 19, the -mcpu target for Apple chips is called apple-m4 instead of apple-latest: https://www.phoronix.com/news/Apple-M4-Added-To-LLVM-Clang
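To see which CPU names a given LLVM install actually accepts for a target triple, you can ask llc (which ships with LLVM) directly; the exact list depends on your LLVM build:

```shell
# Print the CPU names the AArch64/Apple backend recognizes in this LLVM.
# If apple-latest is absent from the list, the warning above is expected.
llc -mtriple=arm64-apple-macos -mcpu=help
```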

Evan-Zhao avatar Dec 19 '24 17:12 Evan-Zhao

As per your error logs, the resolution steps for a similar issue (#3177) may apply; you may try those.

aakashv000 avatar Mar 30 '25 02:03 aakashv000