
feat: import models via URI

Open · mudler opened this pull request 1 month ago · 1 comment

Description

This PR is a starting point for #7114

It adds an "import by URL" section:

[Screenshot (2025-11-11): LocalAI - Import Model]

Currently covers:

  • direct URLs to GGUF files (llama.cpp), limited to single-file models (for now)
  • MLX

For example, specifying https://huggingface.co/Qwen/Qwen3-0.6B-GGUF/blob/main/Qwen3-0.6B-Q8_0.gguf will automatically install and configure the model with llama.cpp.

It currently abstracts the logic into "Importers" and prepares scaffolding for follow-ups: the HF API client is moved to pkg, since we will need it to parse a repository when one is provided instead of a direct file URL.

Todo:

  • [ ] Add support for other providers (vLLM, etc)
  • [ ] Add support for selection of GGUF file if adding the whole repository
  • [ ] Add support for multiple files with llama.cpp (models split in different gguf files)
  • [ ] Add support for GGUF with mmproj files (VL models)
  • [ ] Support OCI/ollama

Notes for Reviewers

Signed commits

  • [ ] Yes, I signed my commits.

mudler avatar Nov 11 '25 17:11 mudler

Deploy Preview for localai ready!

| Name | Link |
|------|------|
| Latest commit | 9366052eb955f7f12446eda762f638a2eaee61d3 |
| Latest deploy log | https://app.netlify.com/projects/localai/deploys/6914d0f79973f40008445c7e |
| Deploy Preview | https://deploy-preview-7245--localai.netlify.app |

netlify[bot] avatar Nov 11 '25 17:11 netlify[bot]