Steve Grubb
Thanks for pointing to a source. I have not merged that code upstream because there wasn't clear consensus that the patch series was going to ever be accepted. I think...
Closing this. Thanks for the report.
It is possible to create the trust file on another system and then drop it in the trust.d directory. If we make any loopholes for fapolicyd-cli, it becomes a possible...
I was thinking you'd make that trust file as part of the deployment of the custom software. A second option would be to package the custom software so that it's...
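As a rough sketch of what that deployment step could look like (binary path and trust-file name here are hypothetical examples):

```
# On the build/admin host: generate a trust entry for the custom binary.
fapolicyd-cli --file add /opt/acme/bin/acmed --trust-file acme
# This writes /etc/fapolicyd/trust.d/acme containing a line of the form:
#   /opt/acme/bin/acmed <size-in-bytes> <sha256-hash>
# Ship that file with the package; then on the target system:
fapolicyd-cli --update   # tell the running daemon to reload its trust database
```

The trust.d file is plain text, so it can also be generated in a package's %post script or dropped in place by whatever deployment tooling you use.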
Been a while on this. If you don't want the audit event on allowing access, just make the decision "allow" (no audit). Also, a rule could be created to allow...
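For anyone hitting the same thing, a hedged sketch of such a rule (the file name and paths are hypothetical examples, not from an actual deployment). A plain `allow` decision, unlike `allow_audit`, generates no audit event:

```
# /etc/fapolicyd/rules.d/30-example.rules  (hypothetical file name)
# Plain "allow" produces no audit event; use "allow_audit" if you want one.
allow perm=execute exe=/usr/bin/bash : path=/opt/acme/bin/acmed
```

Rules are evaluated in order, so a rule like this needs to come before any broader deny rule that would otherwise match.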
Checked our rules; the allow-access-to-all rule grants access. I'd suggest delivering a trust file as part of the software installation. I don't think there's much else we can do....
I've been taking notes on how to run vllm. I have to say that it has changed since I started working on this. With PR #384 in place, it should...
Yes, there are gotchas. Running vllm itself remotely is straightforward. The API server image has a lot of supporting commands that are needed for networking and mounting things. Also, routing...
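For anyone following along, the basic remote setup can be sketched like this (the model name, host, and port are illustrative assumptions, not the exact values from my notes):

```
# On the GPU host: start vLLM's OpenAI-compatible API server.
python -m vllm.entrypoints.openai.api_server \
    --model meta-llama/Llama-3.1-8B-Instruct \
    --port 8000

# On the client side, point the llama-stack remote-vllm template at it,
# e.g. via the URL environment variable the template reads:
export VLLM_URL=http://gpu-host:8000/v1
```

The networking gotchas mostly come down to making sure the container can actually reach that URL and that the model name the client requests matches what vLLM was started with.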
I opened issue #405. We still have model name mismatches.
OK, to follow up: I updated to 0.0.53, rebuilt the container, and used the run-with-safety.yaml file, and it all works. Finally! What needs to be done? There is https://github.com/meta-llama/llama-stack/blob/main/llama_stack/templates/remote-vllm/doc_template.md...