Daniyar
This problem still persists. When I install it via the ComfyUI Manager, restarting ComfyUI hangs. #205
Also, port `9763` throws a timeout.
@lizijian0630 I agree with you. The docs have serious problems. If you look [here](https://github.com/guidance-ai/guidance/blob/main/guidance/models/_lite_llm.py), you can see that it can use LiteLLM, which itself can work with Ollama or...
@riedgar-ms I've looked at this code, and it raises even more questions. Is it only possible to build a server using this library's built-in functionality? Is it possible...
The problem still persists, I hope they will fix it soon.