[importers] Add diffuser backend importer with ginkgo tests and UI support
Description
Adds an importer for diffuser backend models, following the existing importer patterns (llama-cpp, mlx, vllm, transformers), including UI support in the model editor.
Implementation
- `diffuser.go`: Implements `DiffuserImporter`, matching models via (see the sketch after this list):
  - `backend: "diffusers"` preference
  - HuggingFace files: `model_index.json`, `scheduler/scheduler_config.json`
- `diffuser_test.go`: 18 ginkgo tests covering match/import logic and preferences
- `importers.go`: Registered `DiffuserImporter` in `defaultImporters`
- `model-editor.html`: Added diffuser preferences to the model editor UI:
  - Backend dropdown now includes a "diffusers" option
  - Dynamic fields appear when the diffusers backend is selected: Pipeline Type, Scheduler Type, Enable Parameters, CUDA checkbox
  - All fields integrate with the existing preferences system
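For orientation, here is a minimal Go sketch of what that matching logic might look like; the `ModelDetails` type, its field names, and `matchesDiffusers` are assumptions for illustration, not the actual LocalAI interfaces.

```go
package importers

import "strings"

// ModelDetails is a stand-in for the metadata an importer receives;
// the real LocalAI types and field names may differ.
type ModelDetails struct {
	Preferences map[string]string // user-supplied preferences, e.g. backend: "diffusers"
	Files       []string          // files discovered in the HuggingFace repository
}

// matchesDiffusers sketches the two matching rules described above:
// an explicit backend preference, or diffusers-specific repository files.
func matchesDiffusers(d ModelDetails) bool {
	if d.Preferences["backend"] == "diffusers" {
		return true
	}
	for _, f := range d.Files {
		if f == "model_index.json" || strings.HasSuffix(f, "scheduler/scheduler_config.json") {
			return true
		}
	}
	return false
}
```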
Configuration Output
backend: diffusers
known_usecases:
- image
diffusers:
  cuda: true
  pipeline_type: StableDiffusion3Pipeline
  scheduler_type: k_dpmpp_2m
  enable_parameters: negative_prompt,num_inference_steps
parameters:
  model: https://huggingface.co/stabilityai/stable-diffusion-3-medium
The importer supports preference overrides for `pipeline_type`, `scheduler_type`, `cuda`, and `enable_parameters`, and defaults to `StableDiffusionPipeline` with standard parameters.
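As a rough illustration of how defaults and overrides could combine (the `DiffusersConfig` struct, the `applyPreferences` helper, and the default `enable_parameters` value are assumptions based on the example output above, not the importer's actual code):

```go
package importers

// DiffusersConfig mirrors the diffusers block of the generated YAML shown above;
// the real LocalAI config type is likely named and structured differently.
type DiffusersConfig struct {
	CUDA             bool   `yaml:"cuda"`
	PipelineType     string `yaml:"pipeline_type"`
	SchedulerType    string `yaml:"scheduler_type,omitempty"`
	EnableParameters string `yaml:"enable_parameters"`
}

// applyPreferences fills in the documented defaults, then lets any
// user-supplied preference override them.
func applyPreferences(prefs map[string]string) DiffusersConfig {
	cfg := DiffusersConfig{
		PipelineType:     "StableDiffusionPipeline",             // default pipeline
		EnableParameters: "negative_prompt,num_inference_steps", // assumed standard parameters
	}
	if v, ok := prefs["pipeline_type"]; ok {
		cfg.PipelineType = v
	}
	if v, ok := prefs["scheduler_type"]; ok {
		cfg.SchedulerType = v
	}
	if v, ok := prefs["enable_parameters"]; ok {
		cfg.EnableParameters = v
	}
	if prefs["cuda"] == "true" {
		cfg.CUDA = true
	}
	return cfg
}
```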
UI Changes
The model editor now provides a user-friendly interface for configuring diffuser models with backend-specific options that appear dynamically when "diffusers" is selected.
Notes for Reviewers
18 new tests added (88 total). Three pre-existing test failures in `LlamaCPPImporter` are unrelated to these changes.
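A representative ginkgo spec in the style of the suite described here might look like the following; it builds on the hypothetical `matchesDiffusers` sketch above rather than the real `DiffuserImporter` API, and would live in a `_test.go` file.

```go
package importers

import (
	"testing"

	. "github.com/onsi/ginkgo/v2"
	. "github.com/onsi/gomega"
)

// Suite bootstrap so the specs below run with `go test`.
func TestDiffuserImporterSketch(t *testing.T) {
	RegisterFailHandler(Fail)
	RunSpecs(t, "DiffuserImporter sketch suite")
}

var _ = Describe("DiffuserImporter matching", func() {
	It("matches when the backend preference is diffusers", func() {
		d := ModelDetails{Preferences: map[string]string{"backend": "diffusers"}}
		Expect(matchesDiffusers(d)).To(BeTrue())
	})

	It("matches when model_index.json is present", func() {
		d := ModelDetails{Files: []string{"model_index.json"}}
		Expect(matchesDiffusers(d)).To(BeTrue())
	})

	It("does not match an unrelated GGUF repository", func() {
		d := ModelDetails{Files: []string{"model.Q4_K_M.gguf"}}
		Expect(matchesDiffusers(d)).To(BeFalse())
	})
})
```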
Signed commits
- [x] Yes, I signed my commits.
Original prompt
Look at the already existing importers implementation and add one for the diffuser backend. Add also ginkgo tests.
Deploy Preview for localai ready!
| Name | Link |
|---|---|
| Latest commit | 9acb271ea2a15a8d972ba3e9549e41e3354d3d2d |
| Latest deploy log | https://app.netlify.com/projects/localai/deploys/691f27f946abbf0008fe9c84 |
| Deploy Preview | https://deploy-preview-7316--localai.netlify.app |
@copilot add the new preferences to the common preferences of the model-editor import section
Added diffuser preferences to the model-editor import section in commit d6464fa.
The following fields now appear when "diffusers" backend is selected:
- Pipeline Type (pipeline_type)
- Scheduler Type (scheduler_type)
- Enable Parameters (enable_parameters)
- CUDA checkbox (cuda)
All fields integrate with the existing preferences system and are passed to the DiffuserImporter.
@copilot be consistent and use yaml/v3
Updated to use gopkg.in/yaml.v3 for consistency in commit 9acb271.
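For illustration, marshaling a trimmed-down config with `gopkg.in/yaml.v3` could look like this; `modelConfig` and its fields are hypothetical stand-ins, not LocalAI's actual types.

```go
package main

import (
	"fmt"

	"gopkg.in/yaml.v3"
)

// modelConfig is a hypothetical, trimmed-down stand-in for the importer's
// output; the real config struct has more fields.
type modelConfig struct {
	Backend   string            `yaml:"backend"`
	Diffusers map[string]any    `yaml:"diffusers"`
	Params    map[string]string `yaml:"parameters"`
}

func main() {
	cfg := modelConfig{
		Backend: "diffusers",
		Diffusers: map[string]any{
			"cuda":          true,
			"pipeline_type": "StableDiffusion3Pipeline",
		},
		Params: map[string]string{
			"model": "https://huggingface.co/stabilityai/stable-diffusion-3-medium",
		},
	}
	// yaml.v3 honours the same struct tags as yaml.v2, so the switch is drop-in here.
	out, err := yaml.Marshal(cfg)
	if err != nil {
		panic(err)
	}
	fmt.Print(string(out))
}
```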
[!WARNING]
Firewall rules blocked me from connecting to one or more addresses.
I tried to connect to the following addresses, but was blocked by firewall rules:
huggingface.co
- Triggering command: `/tmp/go-build2054339694/b001/importers.test -test.paniconexit0 -test.timeout=10m0s -test.v=true` (dns block)

If you need me to access, download, or install something from one of these locations, you can either:
- Configure Actions setup steps to set up my environment, which run before the firewall is enabled
- Add the appropriate URLs or hosts to the custom allowlist in this repository's Copilot coding agent settings (admins only)