
How to configure Neural Chat to work with chatglm3-6b?

Open ahlwjnj opened this issue 1 year ago • 0 comments

I have successfully run ./chatglm3-6b with intel-extension-for-transformers on my laptop, but when I try to use Neural Chat to run the same model (./chatglm3-6b), it fails with: "Process finished with exit code 137 (interrupted by signal 9: SIGKILL)"

Code:

```python
from intel_extension_for_transformers.neural_chat import build_chatbot, PipelineConfig
from intel_extension_for_transformers.transformers import RtnConfig

config = PipelineConfig(
    model_name_or_path='./chatglm3-6b',
    optimization_config=RtnConfig(
        bits=4,
        compute_dtype="int8",
        weight_dtype="int4_fullrange"
    )
)
chatbot = build_chatbot(config)
response = chatbot.predict(query="Hi")
```

CPU: i7-13700H, Memory: 16 GB, OS: Ubuntu 22.04
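For what it's worth, exit code 137 means the process received SIGKILL, which on Linux usually points to the kernel OOM killer rather than a crash inside the library. To confirm that, I can watch available memory while build_chatbot runs, roughly like this (a small sketch using psutil; the helper name and interval are just mine):

```python
import threading
import time

import psutil

def log_memory(stop_event, interval=1.0):
    # Print available system memory every `interval` seconds while the chatbot builds.
    while not stop_event.is_set():
        avail_gb = psutil.virtual_memory().available / 1024 ** 3
        print(f"available memory: {avail_gb:.2f} GiB")
        time.sleep(interval)

stop = threading.Event()
threading.Thread(target=log_memory, args=(stop,), daemon=True).start()

# ... call build_chatbot(config) here ...

stop.set()
```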

Q1: How should I configure Neural Chat so it works with chatglm3-6b?
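For reference, running the model directly through intel-extension-for-transformers already works for me, roughly along these lines (a sketch from memory; the exact arguments I used may differ):

```python
from transformers import AutoTokenizer
from intel_extension_for_transformers.transformers import AutoModelForCausalLM

model_path = "./chatglm3-6b"
tokenizer = AutoTokenizer.from_pretrained(model_path, trust_remote_code=True)
inputs = tokenizer("Hi", return_tensors="pt").input_ids

# 4-bit weight-only quantization handled by the low-bit LLM runtime
model = AutoModelForCausalLM.from_pretrained(model_path, load_in_4bit=True, trust_remote_code=True)
outputs = model.generate(inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

So the question is how to get the equivalent behaviour through Neural Chat's PipelineConfig without the process being killed.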

Q2: How can I expose a server API on top of the Q4 model bin file (ne_chatglm2_q_nf4_bestla_cfp32_g32.bin)?
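To clarify what I mean by a server API, the HTTP layer I have in mind is roughly the sketch below (FastAPI); the model-loading and generation calls are placeholders for the low-bit runtime API I'm asking about, not real functions:

```python
# Minimal sketch of the HTTP layer only; `load_q4_model` and `model.generate`
# below are hypothetical placeholders for the part I don't know how to do.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class ChatRequest(BaseModel):
    prompt: str
    max_new_tokens: int = 128

# Placeholder: load ne_chatglm2_q_nf4_bestla_cfp32_g32.bin once at startup.
# model = load_q4_model("ne_chatglm2_q_nf4_bestla_cfp32_g32.bin")

@app.post("/chat")
def chat(req: ChatRequest):
    # Placeholder: run generation with the loaded quantized model.
    # text = model.generate(req.prompt, max_new_tokens=req.max_new_tokens)
    text = "..."  # stand-in response until the real runtime call is known
    return {"response": text}
```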

Thanks a lot.

ahlwjnj · Jul 09 '24 14:07