lollms-webui
Exits without feedback / Cannot load _pyllamacpp
Current Behavior
If I run run.bat, I get nothing but the logo. By inspecting the run.bat file, I saw that I was stuck at the pause > nul line and that the execution of app.py had already finished.
Running app.py manually also just exited without doing anything. By professional print-debugging (irony), I was able to trace the error back to _pyllamacpp.cp311-win_amd64.pyd, as you can see in the attached screenshots.
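For reference, the change boiled down to something like this (a rough sketch, not my exact edit; the real failing import lives in pyllamacpp/model.py):

```python
# Rough sketch of the print-debugging: wrap the native-extension import so the
# otherwise silent failure prints a traceback instead of just exiting.
import traceback

try:
    import _pyllamacpp as pp  # the import that fails inside pyllamacpp/model.py
except Exception:
    traceback.print_exc()  # show the real error before the process dies
    raise
```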
Context
- Windows 10 / build 10.0.19045
- x64 architecture
- RAM is either 8 GB or 16 GB; I could not check it right now, but it is probably 16 GB.
- Let me know if I should add anything else.
Screenshots
Modified code:
Result:
What is your CPU?
If it does not support the AVX2 instruction set, then you're out of luck.
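If you are unsure, one quick way to check from Python (assuming the third-party py-cpuinfo package, installed with pip install py-cpuinfo) is:

```python
# Sketch: report the CPU model and whether it advertises the AVX2 flag.
import cpuinfo  # third-party: pip install py-cpuinfo

info = cpuinfo.get_cpu_info()
print("CPU:", info.get("brand_raw", "unknown"))
print("AVX2 supported:", "avx2" in info.get("flags", []))
```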
Having a similar problem here, but mine does produce proper output:
```
Traceback (most recent call last):
  File "D:\Users\x\Documents\gpt4all-ui\app.py", line 33, in <module>
    from pyGpt4All.api import GPT4AllAPI
  File "D:\Users\x\Documents\gpt4all-ui\pyGpt4All\api.py", line 14, in <module>
    from pyllamacpp.model import Model
  File "D:\Users\x\Documents\gpt4all-ui\env\lib\site-packages\pyllamacpp\model.py", line 21, in <module>
    import _pyllamacpp as pp
ModuleNotFoundError: No module named '_pyllamacpp'
```
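In case it helps with debugging, here is a small sketch (run it with the project's own interpreter, e.g. env\Scripts\python.exe, assuming a standard venv layout) to check whether the compiled extension is present in the environment at all; find_spec locates modules without importing them:

```python
# Sketch: check whether pyllamacpp and its compiled _pyllamacpp extension are
# both installed; find_spec locates a module without executing its import.
import importlib.util

for name in ("pyllamacpp", "_pyllamacpp"):
    spec = importlib.util.find_spec(name)
    print(name, "->", spec.origin if spec else "NOT FOUND")
```

If _pyllamacpp comes back NOT FOUND, reinstalling pyllamacpp inside that environment (for example with pip install --force-reinstall pyllamacpp) is probably the next thing to try.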