ipex-llm
vLLM offline_inference.py fails to run for CPU inference
Running `python offline_inference.py` from the linked vLLM offline-inference example on CPU fails. It seems that llm.py was removed in a previous version; a rough sketch of what the script attempts is shown below.
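For context, a minimal offline-inference script along these lines might look like the sketch below. It assumes the upstream `vllm.LLM` / `SamplingParams` API and the `facebook/opt-125m` model as placeholders; the ipex-llm-specific wrapper that lived in llm.py may have used a different import path, and on a CPU-only machine this only works if a CPU backend is available.

```python
# Minimal sketch of vLLM offline inference (assumes upstream vLLM API;
# the removed ipex-llm llm.py wrapper may have exposed a different import).
from vllm import LLM, SamplingParams

prompts = ["Hello, my name is", "The capital of France is"]
sampling_params = SamplingParams(temperature=0.8, top_p=0.95)

# Load the model; on a CPU-only setup this step needs a CPU backend,
# which is what the removed ipex-llm integration provided.
llm = LLM(model="facebook/opt-125m")

# Generate completions for each prompt and print them.
outputs = llm.generate(prompts, sampling_params)
for output in outputs:
    print(output.prompt, "->", output.outputs[0].text)
```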
Hi, the vLLM CPU backend has been removed for now. Support will be added back later. Sorry for the inconvenience.