
Unable to save quantized model

Open vmadananth opened this issue 2 months ago • 1 comment

I'm trying to save an INT4 quantized model. When I try to save it, I get the following error:

Traceback (most recent call last):
  File "C:\Users\AI-Perf\Varsha\ipex-llm\python\llm\example\GPU\HF-Transformers-AutoModels\Save-Load\generate.py", line 58, in <module>
    model.save_low_bit(save_path)
  File "C:\Users\AI-Perf\.conda\envs\ipex-llm\Lib\site-packages\ipex_llm\transformers\model.py", line 62, in save_low_bit
    delattr(self.config, "_pre_quantization_dtype")
AttributeError: 'LlamaConfig' object has no attribute '_pre_quantization_dtype'
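For reference, the crash comes from save_low_bit unconditionally calling delattr on the model config. A caller-side guard like the untested sketch below might sidestep it, though I don't know whether it is the right fix; the torch.float16 value is only an assumption about the pre-quantization dtype, and `model` / `save_path` are the same objects as in the traceback above.

```python
# Hypothetical, untested workaround sketch: give the config the attribute that
# save_low_bit tries to delete, so the unconditional delattr inside
# ipex_llm/transformers/model.py does not raise.
import torch

if not hasattr(model.config, "_pre_quantization_dtype"):
    # torch.float16 is only an assumption about the original (pre-quantization) dtype.
    model.config._pre_quantization_dtype = torch.float16

model.save_low_bit(save_path)
```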

Attached is my code (generate_profile_thebloke.txt); a condensed sketch of what it does is below.
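Roughly, the script follows the Save-Load example that the traceback points at. This is a condensed sketch, not the attached file; the model path, prompt, and save directory here are placeholders.

```python
# Condensed sketch of the Save-Load flow (placeholders for paths).
from ipex_llm.transformers import AutoModelForCausalLM
from transformers import AutoTokenizer

model_path = "path/to/llama-model"       # placeholder: Hugging Face repo id or local path
save_path = "./llama-int4-low-bit"       # placeholder: directory for the low-bit model

# Load the model with INT4 weight-only quantization via ipex-llm.
model = AutoModelForCausalLM.from_pretrained(model_path,
                                             load_in_4bit=True,
                                             trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained(model_path, trust_remote_code=True)

# Saving the quantized weights is the call that raises the AttributeError.
model.save_low_bit(save_path)
tokenizer.save_pretrained(save_path)
```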

vmadananth · May 16 '24 15:05