Results: 295 comments by Steve Grubb

All demos (llama-stack-apps) document that the default port of the API server is 5000. Using 5001 deviates from the documentation. I see parts of the docs now...

Digging a little more into the resulting container... I ran `llama stack build --template remote-vllm --image-type docker`. The resulting image has a copy of vLLM inside it. Does it need one?...

I see the reported demo failures are fixed. Thanks! I also see that vLLM as a container dependency has been dropped. And I also see that torch is needed for...

This is in the current plans for the next release.

PR #383 should address this.

Closing this since 1.4.2 was released with this fix.

How does the process gain group 106248123? Is it an effective, saved-set, or filesystem GID? Or is this in /etc/group, where test_pda2 is on the same line that defines...
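For illustration, an /etc/group membership like the one being asked about would look something like the following. The group name `somegroup` is hypothetical; the format is `name:password:GID:member-list`, and a user listed in the member-list field gains that group as a supplementary group.

```
# Hypothetical /etc/group entry: a group with GID 106248123 whose
# member list includes test_pda2 (granting it as a supplementary group)
somegroup:x:106248123:test_pda2
```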

I guess, specifically, what do you get for `cat /proc/self/status | grep Groups` when you are set up the same as when ~/bin/ls was run?
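As a sketch of what that check reports (assuming a Linux /proc layout): the `Gid:` line of /proc/self/status lists the real, effective, saved-set, and filesystem GIDs in that order, and the `Groups:` line lists the supplementary group IDs.

```shell
# Show all group IDs of the current process.
# Gid:    real  effective  saved-set  filesystem
# Groups: supplementary group list
grep -E '^(Gid|Groups):' /proc/self/status
```

Comparing the `Groups:` line here against the audit record should show whether 106248123 is a supplementary group or something else entirely.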

Version 1.3.7 was released earlier today. It now collects more group information and possibly solves the reported issue. Please give it a try to see if we can close this...

Is there a man page for this syscall? Enabling the syscall number may not be sufficient. We may need to add interpretations depending on the syscall.