Ettore Di Giacinto
You should be able to install it from the gallery:

```
local-ai run nomic-embed-text-v1.5
```
New import flow is available on `master`: https://github.com/user-attachments/assets/01f3ed3c-d6d3-48bb-b11e-384c4299c893

Currently it works with:
- llama.cpp
- vLLM
- transformers
- MLX
- MLX-VLM
Should be covered by https://github.com/mudler/LocalAI/pull/3954 - we had a hacky workaround in place for working around https://github.com/descriptinc/audiotools/pull/111
Still present on the `master` images; needs deeper investigation.
> Update: Using "SINGLE_ACTIVE_BACKEND" instead of "LOCALAI_SINGLE_ACTIVE_BACKEND" appears to resolve the issue. Suggesting the documentation [here](https://localai.io/advanced/#backend-flags) is outdated or flawed.

`SINGLE_ACTIVE_BACKEND` and `LOCALAI_SINGLE_ACTIVE_BACKEND` are equivalent: internally they toggle the...
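For illustration only, the "both spellings are equivalent" behavior can be sketched like this (a hypothetical helper, not LocalAI's actual code): try the `LOCALAI_`-prefixed name first, then fall back to the bare name.

```python
import os

def env_flag(name, prefix="LOCALAI_"):
    """Read a boolean flag from the environment, accepting both the
    prefixed and the bare spelling; the prefixed form wins if both are set.
    (Hypothetical sketch, not LocalAI's real implementation.)"""
    for candidate in (prefix + name, name):
        value = os.environ.get(candidate)
        if value is not None:
            return value.lower() in ("1", "true", "yes")
    return False

# Either spelling enables the flag:
os.environ["SINGLE_ACTIVE_BACKEND"] = "true"
print(env_flag("SINGLE_ACTIVE_BACKEND"))  # → True
```

With such a fallback, the documented `LOCALAI_`-prefixed names and the shorter legacy names behave the same way.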
Hi! Thanks for bringing this up - what's not entirely clear to me is: does it work for you after running `apt-get upgrade` in the LocalAI container?
> No neither through the containers (even after doing updates/upgrades) or building locally from source. Ends up in that same no kernel found.

Ok, that's a good data point!

>...
Builds don't include backends anymore; however, the backend should have been pulled automatically from the gallery. Please share the full logs.
@siddimore thanks for taking a stab at this - the direction looks good here; just a few minor nits here and there, but definitely not blockers.
> Some questions for maintainers: > > * Should I include these flags in the base-level Dockerfile? Since backends are now built/shipped separately, I assume the usage in the base-level...