风暴

Results: 5 issues of 风暴

When I ran experiments with Baichuan2-13B-Chat, I found that CPU utilization on each core was only about 20%. I believe this is because BigDL-LLM did not...
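(Not a confirmed BigDL-LLM fix, just a generic starting point: per-core utilization this low often means the intra-op thread count does not match the machine's physical core count. The sketch below assumes a PyTorch-based pipeline; the core count of 48 is a placeholder.)

```python
import torch

# Assumption: 48 physical cores; replace with the actual core count of the machine.
PHYSICAL_CORES = 48

# Pin PyTorch's intra-op thread pool to the physical core count so the
# OpenMP workers are neither oversubscribed nor left mostly idle.
torch.set_num_threads(PHYSICAL_CORES)

print("intra-op threads:", torch.get_num_threads())
```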

user issue

### Describe the issue

![image](https://github.com/intel/intel-extension-for-pytorch/assets/57557769/5bc5cd5f-4433-479d-9bd2-77144afa6f54)

Hi, my machine is as shown above. I want to run a benchmark; which command gives the best performance? The command I tried before was this,...
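(The project's recommended benchmark command is not shown in this excerpt; the following is only a generic Intel Extension for PyTorch usage sketch. The ResNet-50 workload and the bfloat16 dtype are placeholder choices.)

```python
import torch
import intel_extension_for_pytorch as ipex
import torchvision.models as models

# Placeholder workload; substitute the model actually being benchmarked.
model = models.resnet50(weights=None).eval()

# ipex.optimize applies CPU-side weight-layout and operator optimizations;
# bfloat16 is a placeholder dtype and depends on hardware support.
model = ipex.optimize(model, dtype=torch.bfloat16)

x = torch.randn(1, 3, 224, 224)
with torch.no_grad(), torch.cpu.amp.autocast(dtype=torch.bfloat16):
    y = model(x)
print(y.shape)
```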

### Describe the issue

Does this library support Qwen-14B-Chat? If not, are there plans to add support in the future?

Query
LLM

When I use the python_api_example or streaming_llm Python scripts to run inference with Qwen-14B-Chat, the first two questions are answered normally, but from the third question onward the output keeps repeating itself. I find it...
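(Not a confirmed fix for the issue above, only a common mitigation: repetitive generation is often reduced by passing a repetition penalty and sampling parameters to Hugging Face `generate`. The model path, prompt, and parameter values below are placeholders, and this sketch uses plain Transformers rather than the scripts mentioned in the issue.)

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder model path; Qwen checkpoints generally need trust_remote_code=True.
model_id = "Qwen/Qwen-14B-Chat"
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

inputs = tokenizer("Hello, who are you?", return_tensors="pt")
# repetition_penalty > 1.0 penalizes tokens that already appeared, a common
# mitigation for degenerate repeating output; the values here are placeholders.
outputs = model.generate(
    **inputs,
    max_new_tokens=128,
    do_sample=True,
    temperature=0.7,
    repetition_penalty=1.1,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```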

### Is there an existing issue for this?

- [X] I have searched the existing issues

### Environment

```markdown
- Milvus version: v2.3.0-1977-g4ae7cabb0-dev
- Deployment mode (standalone or cluster): standalone
- MQ type (rocksmq,...
```

kind/bug
triage/needs-information