CogVLM
CogAgent: 24 GB of GPU memory is not enough
You need 36 GB to run the FP16 model; 24 GB is only enough for the 4-bit model.
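For reference, here is a minimal sketch of loading the model within a 24 GB budget via 4-bit quantization. It assumes the `THUDM/cogagent-chat-hf` checkpoint and the `lmsys/vicuna-7b-v1.5` tokenizer (the pairing used by the HF demos) and uses transformers' `BitsAndBytesConfig`; it is an illustration, not the exact demo code.

```python
import torch
from transformers import AutoModelForCausalLM, LlamaTokenizer, BitsAndBytesConfig

# Assumed checkpoint/tokenizer ids; the HF demos pair the CogAgent
# checkpoint with the Vicuna-7B tokenizer.
MODEL_ID = "THUDM/cogagent-chat-hf"
TOKENIZER_ID = "lmsys/vicuna-7b-v1.5"

tokenizer = LlamaTokenizer.from_pretrained(TOKENIZER_ID)

# 4-bit weights (bitsandbytes) so the model fits in ~24 GB;
# compute still runs in fp16.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_compute_dtype=torch.float16,
)

model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    torch_dtype=torch.float16,
    quantization_config=bnb_config,
    trust_remote_code=True,  # CogVLM/CogAgent ship custom modeling code
    device_map="auto",
).eval()
```

Loading the same checkpoint without `quantization_config` keeps the weights in FP16, which is where the ~36 GB requirement comes from.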
```
Traceback (most recent call last):
  File "/home/spider/slj/project/wangjingli/CogVLM/basic_demo/cli_demo_hf.py", line 79, in ...
RuntimeError: probability tensor contains either `inf`, `nan` or element < 0
```
When I run `python cli_demo_hf.py --quant 4 --fp16`, I get this error. Why does it happen?