lollms-webui
Latest version is unusable with ExLlama
When using the latest version, every model I try to load with ExLlama either throws an error or produces bad results (nonsense).
It used to work great in prior versions.
Steps to Reproduce
Using latest version from scratch
- Step 1 - install exllama v2 binding
- Step 2 - choose any model
- Step 3 - try to ask a simple question
Windows 11 , 3090Ti
Just did a fresh install using the v6.7 bat file installer; same results.
Managed to fix the issue by using the latest bat file made 20 hours ago (not on the release page). Yi-34B and Deepseek coder models are not working; GPTQ seems fine.
Yes, because they require me to enable the execution of remote code. That's too risky an operation. I'd rather wait until Hugging Face adds this to their library. Executing code from a model is a security risk, as someone could create a virus and hide it inside a model.
If you trust them enough and want to override this, just add trust_remote_code=True to the from_pretrained call when creating the tokenizer. It should work.