MikoAL
Do `mklink "llama/main" C:\Users\User\Desktop\Projects\llama\llama.cpp\build\bin\Release\main.exe` in cmd. Remember to create the folder "llama" and copy the file "main.exe" into it. NOTE: I am a confused idiot and this may be...
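For anyone who prefers not to run `mklink` (it can require admin rights), a rough Python equivalent of the same steps might look like the sketch below; the build path is just the example from my machine, adjust it to yours:

```python
import shutil
from pathlib import Path

# Rough equivalent of the cmd steps above (copy instead of mklink,
# which avoids needing admin rights). The source path is just an example.
build_exe = Path(r"C:\Users\User\Desktop\Projects\llama\llama.cpp\build\bin\Release\main.exe")
target_dir = Path("llama")

target_dir.mkdir(exist_ok=True)                    # create the "llama" folder
shutil.copy2(build_exe, target_dir / "main.exe")   # copy main.exe into it
```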
Have you tried setting the seed to the same value? The seed you set in the "Reproduction" section is -1, which means a random seed. That could be why?...
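If it helps, here is a minimal sketch of what I mean, assuming the llama-cpp-python bindings (the frontend used in the issue may differ); the model path and prompt are placeholders:

```python
from llama_cpp import Llama

# Pass a fixed seed instead of -1 (random) so repeated runs with the same
# prompt and sampling settings produce the same output.
llm = Llama(model_path="models/ggml-model-q4_0.bin", seed=42)

out = llm("Q: Name the planets in the solar system. A:", max_tokens=32)
print(out["choices"][0]["text"])
```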
It might be a Python version problem? I got the same error while using 3.9, but it worked when using 3.10.
same error, need help
UPDATE: I have no clue why, but when I run this in the VS Code terminal, it works if I press Ctrl + F5, but not when I use...
For me it didn't fix the issue. I have changed the MODEL_PATH to `MODEL_PATH=models/ggml-gpt4all-j-v1.3-groovy.bin`, and I have created a models folder and put the file "ggml-gpt4all-j-v1.3-groovy.bin" in....
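If it still fails, here is a quick sketch to check whether the relative MODEL_PATH actually resolves from the folder you are running in (assuming privateGPT's python-dotenv style config; the default path is just a guess):

```python
import os
from pathlib import Path

from dotenv import load_dotenv  # pip install python-dotenv

load_dotenv()  # reads the .env file in the current working directory

model_path = Path(os.environ.get("MODEL_PATH", "models/ggml-gpt4all-j-v1.3-groovy.bin"))

# A relative MODEL_PATH is resolved against the current working directory,
# so running from a different folder makes the model file "not found".
print("cwd:      ", Path.cwd())
print("resolved: ", model_path.resolve())
print("exists:   ", model_path.exists())
```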
Somehow, I kinda brute-forced a way to make it work: `llm = GPT4All(model=r"C:\this\is\a\path\privateGPT\models\ggml-gpt4all-j-v1.3-groovy.bin", n_ctx=1000, backend='gptj', callbacks=callbacks, verbose=False)`
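If the absolute path is what fixed it, the relative path was probably being resolved against a different working directory. A sketch of the same workaround without hard-coding my machine's path (assuming the langchain GPT4All class and streaming callback that privateGPT uses):

```python
from pathlib import Path

from langchain.callbacks.streaming_stdout import StreamingStdOutCallbackHandler
from langchain.llms import GPT4All

# Same workaround, but the absolute path is built from this script's location
# instead of being hard-coded, so it still works if the project folder moves.
model_path = Path(__file__).resolve().parent / "models" / "ggml-gpt4all-j-v1.3-groovy.bin"

callbacks = [StreamingStdOutCallbackHandler()]  # privateGPT's default streaming callback
llm = GPT4All(
    model=str(model_path),
    n_ctx=1000,
    backend="gptj",
    callbacks=callbacks,
    verbose=False,
)
```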