Local Image Generation

Open · issacdowling opened this issue 2 years ago · 10 comments

**Is your feature request related to a problem? Please describe.**
I wish I could generate images without needing an internet connection or a separate service.

**Describe the solution you'd like**
I'd like to be able to run local models (since Stable Diffusion is open source anyway) using this app.

**Describe alternatives you've considered**
There are Web UIs (like InvokeAI), but I much prefer this native desktop app to something that runs in a browser.

**Additional context**
It would be nice if I could use my GPU to generate images locally, instead of relying on external services.

issacdowling avatar May 13 '23 16:05 issacdowling

Yes, that's on the roadmap (I've started working on it), and the same goes for the matching feature in Bavarder.

0xMRTT avatar May 13 '23 17:05 0xMRTT

Cool, I was just about to open the same issue on Bavarder. Right now I'm relying on distrobox to give me a separate environment that's good for these tasks, so having a flatpak app that does the same would be really nice.

I have 2 more questions then:

  1. How far along is this?
  2. Are there also plans to let you change resolution, iterations, etc. with local models?

issacdowling avatar May 13 '23 17:05 issacdowling

Since the app has been designed to be expandable and is based on a plugin system, I can add as many preferences for each provider as I want. The issue with local models is that I need to add a way to download them.
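
For context, a minimal sketch of what such a download step could look like, assuming the `huggingface_hub` Python package (the repo ID is only an illustrative example, not necessarily what Imaginer will ship with):

```python
# Hypothetical model-download step (assumes the `huggingface_hub`
# package; the repo ID below is only an example).
from huggingface_hub import snapshot_download

def download_model(repo_id: str = "runwayml/stable-diffusion-v1-5") -> str:
    # Fetches all files of the repo into the local Hugging Face cache
    # and returns the path of the cached snapshot directory.
    return snapshot_download(repo_id=repo_id)

if __name__ == "__main__":
    print("Model stored at:", download_model())
```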

0xMRTT avatar May 13 '23 17:05 0xMRTT

@issacdowling Local models will be available in the next release of Bavarder, and in Imaginer soon.

You can see the documentation here

0xMRTT avatar May 21 '23 20:05 0xMRTT

Is there any chance that these projects would ever get the ability to actually run these models within the flatpak? As in, bundling something like llama.cpp for Bavarder, and the equivalent Stable Diffusion tooling for Imaginer, rather than connecting to an API for something else running locally?

Fair enough if not, as I can see how it would add a lot of complexity, but it would mean that getting set up with real local models would be much cleaner.
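
For illustration, the kind of local generation such a bundle would wrap looks roughly like this in Python, assuming the `diffusers` and `torch` packages (the model ID and prompt are placeholders):

```python
# Rough sketch of local Stable Diffusion inference (assumes the
# `diffusers` and `torch` packages; model ID and prompt are examples).
import torch
from diffusers import StableDiffusionPipeline

device = "cuda" if torch.cuda.is_available() else "cpu"
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    # Half precision roughly halves VRAM use, but only works on GPU.
    torch_dtype=torch.float16 if device == "cuda" else torch.float32,
).to(device)

image = pipe(
    "a watercolor painting of a lighthouse at dusk",
    num_inference_steps=25,  # the "iterations" knob asked about above
    height=512,
    width=512,
).images[0]
image.save("output.png")
```

Exposing `height`, `width`, and `num_inference_steps` as per-provider preferences would also cover the resolution/iterations question from earlier in the thread.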

issacdowling avatar May 22 '23 18:05 issacdowling

The issue is that if I bundle, for example, llama.cpp inside the flatpak, the flatpak will be much bigger for a feature that not everyone uses...

0xMRTT avatar May 22 '23 18:05 0xMRTT

Speaking of that, what does it connect to now? A company that mines data?

tio-trom avatar Jul 13 '23 00:07 tio-trom

Hugging Face
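
For context, generation through Hugging Face's hosted Inference API boils down to a single HTTP call, roughly like this sketch (assumes the `requests` package; the model ID and token are placeholders, not necessarily what Imaginer actually uses):

```python
# Rough sketch of a hosted text-to-image call (assumes the `requests`
# package; the model ID and token below are placeholders).
import requests

API_URL = "https://api-inference.huggingface.co/models/runwayml/stable-diffusion-v1-5"
headers = {"Authorization": "Bearer hf_xxx"}  # user-supplied API token

resp = requests.post(
    API_URL,
    headers=headers,
    json={"inputs": "a lighthouse at dusk"},
)
resp.raise_for_status()

# The API returns the generated image as raw bytes.
with open("out.png", "wb") as f:
    f.write(resp.content)
```

So the prompt text does leave the machine with every request, which is what the privacy question above is getting at.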

0xMRTT avatar Jul 13 '23 00:07 0xMRTT

> Hugging Face

Thanks! I wonder if they collect data, and what they collect, via this Imaginer app.

tio-trom avatar Jul 13 '23 00:07 tio-trom

> > Hugging Face
>
> Thanks! I wonder if they collect data, and what they collect, via this Imaginer app.

https://huggingface.co/privacy

0xMRTT avatar Jul 13 '23 00:07 0xMRTT