
Number of threads used for privateGPT.py

Open jackfood opened this issue 1 year ago • 5 comments

What is the thread usage when running ggml? Can I set the number of threads to use when running privateGPT.py?

jackfood avatar May 24 '23 13:05 jackfood

Yes. In the `llm = ...` lines, for either LlamaCpp or GPT4All, add `n_threads=<number of threads>` to the constructor call.

TopNotchSushi avatar May 24 '23 16:05 TopNotchSushi
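The change described above can be sketched as follows. This is a minimal illustration, not the actual privateGPT source: the helper `build_llm_kwargs` and the model path are hypothetical, and in the real script the resulting kwargs would be passed to `LlamaCpp(...)` or `GPT4All(...)` from langchain.

```python
import os

def build_llm_kwargs(model_path, n_threads=None):
    """Hypothetical helper: assemble keyword arguments for the
    LlamaCpp(...) / GPT4All(...) constructor call in privateGPT.py."""
    if n_threads is None:
        # Default to all available cores when the caller does not choose.
        n_threads = os.cpu_count() or 4
    return {
        "model_path": model_path,  # path to the ggml model file
        "n_threads": n_threads,    # number of CPU threads ggml may use
    }

# Usage in privateGPT.py would look roughly like:
#   llm = LlamaCpp(**build_llm_kwargs("models/ggml-model.bin", n_threads=8))
```
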

Thank you, I managed to get it working.

jackfood avatar May 25 '23 00:05 jackfood

It seems like this now only works for LlamaCpp. I tried with 64 threads on a big server and it worked fine, but with GPT4All it is always limited to 4 threads and never goes over 400% CPU usage.

haakonnessjoen avatar Jun 03 '23 14:06 haakonnessjoen

n_threads=(number of threads)

Same here. Setting `llm.n_threads` to 1 or 8 or anything else makes no difference to the thread count.

abalib avatar Jun 04 '23 16:06 abalib

Update to the latest version of langchain (0.193) and `n_threads` works for GPT4All.

JasonMaggard avatar Jun 07 '23 18:06 JasonMaggard