awesome-python
Add Haystack to README.md
What is this Python project?
An open-source NLP framework for building industrial-strength applications on top of Transformer models and LLMs.
What's the difference between this Python project and similar ones?
- Supports a wide range of models
- Supports a wide range of databases and vector stores
- Easily customisable and extensible; NLP pipelines can be composed into larger applications (see the sketch below)
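For context, here is a rough sketch of what composing a pipeline looks like, assuming the Haystack 1.x API (the document store, retriever, and model names below are illustrative choices only):

```python
from haystack.document_stores import InMemoryDocumentStore
from haystack.nodes import BM25Retriever, FARMReader
from haystack.pipelines import ExtractiveQAPipeline

# Index a couple of documents in an in-memory store
document_store = InMemoryDocumentStore(use_bm25=True)
document_store.write_documents([
    {"content": "Haystack is an open-source NLP framework for building search and QA systems."},
    {"content": "It supports many Transformer models and vector stores."},
])

# Compose retriever + reader into a ready-made extractive QA pipeline
retriever = BM25Retriever(document_store=document_store)
reader = FARMReader(model_name_or_path="deepset/roberta-base-squad2")
pipeline = ExtractiveQAPipeline(reader, retriever)

result = pipeline.run(
    query="What is Haystack?",
    params={"Retriever": {"top_k": 3}, "Reader": {"top_k": 1}},
)
print(result["answers"][0].answer)
```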
--
Anyone who agrees with this pull request can submit an Approve review.
ControlNet looks pretty compelling. I've only just got image-to-image up and running in my alpha app, and I'm not a Python programmer, but if a few people are willing to take on the challenge I would be cheering them on!
Agreed. I'm way more confident in my Swift skills, but that said, I could probably stumble through Python enough to help out; I'm just not even sure where to start for something like ControlNet.
+1!
+1
+1
+1
I was able to get ControlNet working with Core ML in my app Guernika, so this is definitely possible. So far I have added preprocessing for pose, depth, and HED maps, and I have successfully converted all eight existing models.
This does require modifying the UNet's inputs, so existing models will not be compatible and will need to be reconverted.
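For anyone curious how the UNet's inputs change, here is a rough, illustrative Python sketch (not Guernika's actual script) of wrapping the diffusers UNet so the ControlNet residuals become explicit inputs that survive conversion; the class and argument names are made up for the example.

```python
import torch
from diffusers import UNet2DConditionModel

class UNetWithControlNetResiduals(torch.nn.Module):
    """Wrapper whose forward() exposes the ControlNet residuals as plain
    tensor arguments, so tracing/conversion sees them as extra model inputs."""

    def __init__(self, unet: UNet2DConditionModel):
        super().__init__()
        self.unet = unet

    def forward(self, sample, timestep, encoder_hidden_states,
                mid_block_residual, *down_block_residuals):
        # diffusers adds these residuals onto the UNet's skip connections,
        # which is how the ControlNet conditioning is injected.
        return self.unet(
            sample,
            timestep,
            encoder_hidden_states=encoder_hidden_states,
            down_block_additional_residuals=list(down_block_residuals),
            mid_block_additional_residual=mid_block_residual,
        ).sample

unet = UNet2DConditionModel.from_pretrained(
    "runwayml/stable-diffusion-v1-5", subfolder="unet"
)
wrapped = UNetWithControlNetResiduals(unet).eval()
# The wrapped module would then be traced with torch.jit.trace (using example
# tensors for every input, residuals included) and handed to coremltools'
# convert(), producing a UNet whose input set differs from the stock models.
```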


Great work @GuiyeC! I just downloaded it and had a play. You've done an excellent job on the UI too. I'd encourage you to submit a pull request to add ControlNet support to the repo if you are willing.
Wow! @GuiyeC I commented on the other thread before I saw this. Looks amazing. It seems like you modified the underlying UNet via Python, then used that in the conversion process for other models? +1 on a merge if you have time!
@gavtron2000 @pj4533 thank you for your comments!
I have the Python scripts in my Guernika repo; you can see how the UNet and ControlNet are implemented for conversion. It's probably not the cleanest way of passing residuals to the UNet, but that's how I managed to do it.
As for the PR, I don't think I'll ever make a PR to this repo, as my repo is already quite different. For example, I implemented img2img first and the implementation is different: if I'm not wrong, random latents in this repo are generated in Python, while I do that in Swift, and my encoder is just the encoder module. I also implemented InstructPix2Pix support, ControlNet, and other tweaks here and there, so it would practically mean taking over this repo. At most I would make my fork public, but I would like to have some time to give my app an edge and get Guernika going.
☹️
Totally understandable @GuiyeC. Really impressive work, and a really excellent app. I wish you every success.
Good job @GuiyeC! Do you have a Discord server? I have some questions/feedback
@Zabriskije you can contact me on Twitter or Telegram (same name), or you can leave them on Hugging Face or Reddit :)
I tried adding ControlNet in this PR! https://github.com/apple/ml-stable-diffusion/pull/153
@ryu38 The ControlNet support is awesome; I just implemented it in my app. However, one quick question.
You provide the ControlNet models in the pipeline initialization, but the ControlNet input images in the pipeline configuration. Is there a way to initialize the pipeline to support a given ControlNet model (meaning I include the name during initialization), but then, when I process a given frame, omit the ControlNet input image so ControlNet isn't used?
Right now, it seems like it is all or nothing. If I have a ControlNet model specified in the pipeline initialization, I get an out-of-range error if I send an empty array to controlNetInputs.
My use case is for video, and I'd like to use controlNet for some frames, but not others, without having to re-init the pipeline.
> Right now, it seems like it is all or nothing. If I have a ControlNet model specified in the pipeline initialization, I get an out-of-range error if I send an empty array to controlNetInputs.
Take a look at this thread: https://github.com/apple/ml-stable-diffusion/issues/197
In particular: "Basically you just need to modify the additionalResiduals (the MLShapedArray) generated from the ControlNet, before passing the value to the ControlledUNet. By the way I got error when putting nil to the values. So modify it instead."
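To make the idea concrete (Python/diffusers used purely for illustration; the Swift pipeline would do the equivalent with MLShapedArray values): because the ControlNet residuals are simply added onto the UNet's skip connections, feeding zero-filled residuals of the right shapes disables the ControlNet's influence without changing the model's input set.

```python
import torch

def zeroed_residuals(down_res, mid_res):
    """Residuals that make the ControlNet contribution a no-op: the UNet just
    adds them to its skip connections, and adding zeros changes nothing."""
    return [torch.zeros_like(r) for r in down_res], torch.zeros_like(mid_res)
```

In Swift, that would mean building zero-filled MLShapedArrays with the shapes the ControlledUNet expects for frames where ControlNet should be skipped, rather than passing nil.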