My goal is similar to #3953 - but I also want to experiment with federated swarms with different types of workers. While I'm messing around, I was thinking it...
So, I'll need to do some more experiments, but `buf` is a pretty neat tool. I've intentionally set this up on an alternative namespace (localai-v1) while I do some preliminary testing...
~~@mudler merged in VAD, but I don't know how to test it manually yet since it's new. Please double check that :)~~
@mudler Ready for review, now with working VAD test.
Merged and squashed this again. @mudler - Can you decide if this PR is worth maintaining? It's been stable through my testing for a while now. I'd like to either...
Thanks a ton! Once I finish up a handful of infra fixes, I'll take a stab at this. Now that transformers already supports different ways to LoadModel anyway... it may...
One idea that comes to mind: we should generate a gallery file on the host and automatically share that out to each worker. That way, we can leverage existing downloader...
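To make that idea concrete, here's a minimal sketch (all names and paths here are hypothetical, not existing LocalAI code) of the host serializing a generated gallery index to JSON and exposing it over HTTP, so each worker could point its existing downloader at that URL:

```go
package main

import (
	"encoding/json"
	"log"
	"net/http"
)

// GalleryEntry is a hypothetical, trimmed-down stand-in for the real
// gallery model metadata (name plus where to download it from).
type GalleryEntry struct {
	Name string `json:"name"`
	URL  string `json:"url"`
}

// buildGallery would, in a real implementation, be generated from the
// models the host actually has installed.
func buildGallery() []GalleryEntry {
	return []GalleryEntry{
		{Name: "example-model", URL: "https://example.com/models/example-model.gguf"},
	}
}

func main() {
	// The host exposes its generated gallery at a well-known path;
	// each worker points its gallery/downloader config at this URL.
	http.HandleFunc("/shared/gallery.json", func(w http.ResponseWriter, r *http.Request) {
		w.Header().Set("Content-Type", "application/json")
		if err := json.NewEncoder(w).Encode(buildGallery()); err != nil {
			http.Error(w, err.Error(), http.StatusInternalServerError)
		}
	})
	log.Fatal(http.ListenAndServe(":9090", nil))
}
```

The nice part of that shape is that workers wouldn't need any new download logic - they just consume a gallery file like they already do, it's only the source URL that changes.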
The failing `test-linux` run was acting strangely and couldn't be restarted... auto-merge will kick off a fresh cycle.
This is a good idea, mudler... but it seems a _bit_ overactive. Can we tune it at all before we set it loose? Edit: I think I'm blaming the static...
https://github.com/mudler/LocalAI/blob/master/core/http/endpoints/localai/backend_monitor.go These endpoints already show which backends are loaded and allow them to be unloaded?
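For reference, here's a quick sketch of poking those endpoints from Go. The routes (`/backend/monitor`, `/backend/shutdown`) and the `model` field in the payload are my assumptions from reading backend_monitor.go - worth double-checking against master before relying on it:

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"io"
	"log"
	"net/http"
)

const base = "http://localhost:8080" // assumed LocalAI API address

// postJSON sends a small JSON body to a LocalAI endpoint and returns the raw response body.
func postJSON(path string, payload any) (string, error) {
	body, err := json.Marshal(payload)
	if err != nil {
		return "", err
	}
	resp, err := http.Post(base+path, "application/json", bytes.NewReader(body))
	if err != nil {
		return "", err
	}
	defer resp.Body.Close()
	out, err := io.ReadAll(resp.Body)
	return string(out), err
}

func main() {
	// Assumed routes: /backend/monitor reports status for a loaded model,
	// /backend/shutdown unloads it. Verify the paths and payload field names
	// against core/http/endpoints/localai/backend_monitor.go.
	status, err := postJSON("/backend/monitor", map[string]string{"model": "my-model"})
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println("monitor:", status)

	if _, err := postJSON("/backend/shutdown", map[string]string{"model": "my-model"}); err != nil {
		log.Fatal(err)
	}
}
```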