Gabor Kukucska

Results: 8 comments of Gabor Kukucska

> It could be that you're connecting to a different ollama instance when you run directly if `OLLAMA_HOST` isn't set for your environment.
>
> Try this: `OLLAMA_HOST=0.0.0.0:63321 ollama pull...`
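For context, the quoted fix boils down to pointing both halves at the same address; a minimal sketch, with the port taken from the quoted command and the model name from the comment below:

```bash
# server side: bind the Ollama instance to the non-default port from the thread
OLLAMA_HOST=0.0.0.0:63321 ollama serve

# client side: the same variable tells the CLI which instance to talk to;
# without it, commands run "directly" hit the default 127.0.0.1:11434
OLLAMA_HOST=127.0.0.1:63321 ollama pull llama2
```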

+1 for ollama support and documentation. I get `ModuleNotFoundError: No module named 'llama_index.embeddings.ollama'` when I use `LLM_EMBEDDING_MODEL="llama2"`, and `ModuleNotFoundError: No module named 'llama_index.embeddings.huggingface'` when I use `LLM_EMBEDDING_MODEL="local"`.
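A likely cause, assuming the modular llama-index (>= 0.10) packaging: each embedding backend ships as its own pip package, so the `ModuleNotFoundError` usually means the matching package was never installed:

```bash
# each backend is a separate package in the modular llama-index layout
pip install llama-index-embeddings-ollama       # provides llama_index.embeddings.ollama
pip install llama-index-embeddings-huggingface  # provides llama_index.embeddings.huggingface
```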

Additionally, it would be awesome if it could also load across networked GPUs like Petals does. This would allow communities with older GPUs to combine their VRAM... especially locally where...
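For reference, contributing a spare GPU to a Petals swarm is roughly a one-liner; a sketch assuming `pip install petals`, with an illustrative model name:

```bash
# join the public swarm and serve a slice of the model from local VRAM
python -m petals.cli.run_server petals-team/StableBeluga2
```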

Actually, there were many errors in the newest version; after rolling back to 210526, I get the above error instead.

> I believe this is a PHP8 incompatibility problem that affects the current and prior versions. The following link elaborates and suggests a fix by commenting out the problematic line...

I (complete noob here) think this needs a model fine-tuned to know how to control a computer solely using keyboard shortcuts. Everything should be achievable that way too, no?...

> I'm going to go ahead and close this as answered. @gaborkukucska you just need the `OLLAMA_HOST` variable set correctly for both the client and the server. I hope you...

Hello :) Any chance of getting pydantic 2+ working with Hivemind any time soon? EDIT: pydantic < 2 is breaking other packages, like the LiteLLM I'm trying to use Petals with.
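The clash can be reproduced in one shell session; the pins below are illustrative, with LiteLLM's pydantic 2 requirement taken from the comment above:

```bash
pip install hivemind "pydantic<2"   # hivemind pins pydantic below 2
pip install litellm                 # expects pydantic 2+, so the resolver breaks the pin above
```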