stable-diffusion-webui
[Script publishing request]: Publish my script
Is there an existing issue for this?
- [X] I have searched the existing issues and checked the recent builds/commits
What would your feature do?
Add my script to the extension repo.
I just realized I didn't describe it. My script lets you use the MagicPrompt AI model right inside the WebUI. It is based on another script found on Reddit (taken for modification and redistribution with permission; all credits are in the readme). The script's official name is "MagicPrompt". Here is the GitHub page: https://github.com/Spaceginner/MagicPrompt-awebui. Its features are:
- Use MagicPrompt inside the WebUI
- Load the MagicPrompt model on a GPU
- Change the prompt every batch
- Automatic installation of the required library and model
- Pregenerate prompts
- Change the behaviour of the MagicPrompt model
- Prioritize your typed prompt over the generated one
- Beautify your prompt by typing it in the prompt field and running this script
- Generate absolutely new random prompts
Why, you ask? Why not?
Proposed workflow
- Go to my repo
- Approve
- Add this script to the extension hub
- ???
- I don't have any more ideas about what to write here
Additional information
MagicPrompt is an AI model that helps with creating prompts. That is the whole description. Original model: https://huggingface.co/Gustavosta/MagicPrompt-Stable-Diffusion. Second use of this model: https://www.aiprompt.io/. Original Reddit post: https://www.reddit.com/r/StableDiffusion/comments/xvjm84/magicprompt_script_for_automatic1111_gui_let_the/. Please.
@Spaceginner We do have a MagicPrompt integration downloadable with an extension: dynamic prompts. I didn't know this myself until it was mentioned. See if that covers all the functionality of your script, and if it doesn't, let's add it.
Really? But this makes no sense. Why is there an AI feature in an extension for prompt syntax?
So, I looked at the MagicPrompt script in dynamic prompts, and I don't like that it uses Hugging Face's library for inference, because it usually freezes my whole PC (which is why I am staying on the aitextgen library). Also, it gave me an error when I tried to use MagicPrompt:
TypeError: MagicPromptGenerator.__init__() takes 2 positional arguments but 4 were given
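For context, this class of TypeError appears whenever a call site passes more positional arguments than a constructor's signature accepts (Python counts `self` as one of them). A minimal reproduction with a hypothetical class, not the actual dynamic-prompts code:

```python
class MagicPromptGenerator:
    # Hypothetical signature: one parameter besides self,
    # so it accepts 2 positional arguments in total.
    def __init__(self, model_name):
        self.model_name = model_name

try:
    # Three arguments plus self -> 4 positional arguments total.
    MagicPromptGenerator("gpt2", "cuda", 100)
except TypeError as exc:
    print(exc)  # ... takes 2 positional arguments but 4 were given
```

So the error above suggests the caller in dynamic prompts was updated to pass extra arguments without the constructor being updated to match.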
So, why not copy my code? I allow it, as long as you mention that it is my code. It also fixes that dumb TODO comment:
# TODO this needs to be fixed
device = 0 if get_optimal_device() == "cuda" else -1
What I do instead (note the to_gpu argument):
aitextgen(model_folder="./models/MagicPrompt/", tokenizer_file="./models/MagicPrompt/tokenizer.json", to_gpu=torch.cuda.is_available())
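The difference between the two snippets is just a device-convention mapping: the Hugging Face pipeline style uses an integer index (0 for the first GPU, -1 for CPU), while aitextgen takes a boolean to_gpu flag. A small sketch of that mapping (the helper name is mine, not from either codebase; in the real script the flag comes from torch.cuda.is_available()):

```python
def select_device(cuda_available):
    """Map a CUDA-availability flag onto both conventions:
    aitextgen's boolean to_gpu and the pipeline-style integer index."""
    return {
        "to_gpu": cuda_available,            # aitextgen convention
        "pipeline_device": 0 if cuda_available else -1,  # HF pipeline convention
    }

print(select_device(True))   # {'to_gpu': True, 'pipeline_device': 0}
print(select_device(False))  # {'to_gpu': False, 'pipeline_device': -1}
```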
And if my code gets copied into dynamic prompts, it will also make the aitextgen installation better, instead of this:
# Try to import aitextgen; if it is not found, install it
# TODO make it better
try:
    from aitextgen import aitextgen
except ImportError:
    print("[MagicPrompt script] aitextgen module is not found, downloading...")
    if os.path.exists("./venv/Scripts/"):
        subprocess.call(["./venv/Scripts/python", "-m", "pip", "-q", "--disable-pip-version-check", "--no-input", "install", "aitextgen"])
    elif os.path.exists("./venv/bin/"):
        subprocess.call(["./venv/bin/python", "-m", "pip", "-q", "--disable-pip-version-check", "--no-input", "install", "aitextgen"])
    else:
        subprocess.call(["python", "-m", "pip", "-q", "--disable-pip-version-check", "--no-input", "install", "aitextgen"])
    print("[MagicPrompt script] aitextgen module is downloaded")
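One way to make that install logic tidier, without guessing venv paths at all, is to reuse the interpreter that is already running the script via sys.executable, which points at the venv's own Python when the WebUI runs inside one. A sketch of the idea, not the dynamic-prompts implementation:

```python
import importlib
import subprocess
import sys


def ensure_package(module_name, pip_name=None):
    """Import module_name, installing it with the current interpreter's
    pip if the import fails. sys.executable resolves to the venv's
    python when running inside a venv, so no path guessing is needed."""
    try:
        return importlib.import_module(module_name)
    except ImportError:
        print(f"[MagicPrompt script] {module_name} not found, installing...")
        subprocess.check_call([
            sys.executable, "-m", "pip", "install", "-q",
            "--disable-pip-version-check", "--no-input",
            pip_name or module_name,
        ])
        return importlib.import_module(module_name)
```

Usage would then be a single line, e.g. `aitextgen_module = ensure_package("aitextgen")`, replacing the three-way branch above.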
> Really? But this makes no sense. Why is there an AI feature in an extension for prompt syntax?
I think it does make sense - the scope of the extension is to assist in prompt generation. This includes prompts using wildcards, but also enhancements such as Magic Prompt.
Well, OK, I agree.
Also, since it didn't freeze my PC, I think no code improvements are needed, and the features that are not in dynamic prompts are (kinda) useless. Also, do you delete those symbols after MagicPrompt is used? I don't delete them.
`[:]()` <some other symbols, but mainly those>
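Since dynamic prompts and the WebUI treat characters like `[`, `]`, `(`, `)`, and `:` as attention/syntax markers, a generated prompt arguably should be sanitized before reuse. A sketch using a regex; the exact character set to strip is an assumption based on the symbols mentioned above:

```python
import re

# Characters the WebUI / dynamic prompts interpret as prompt syntax;
# this set is an assumption, extend it as needed.
SYNTAX_CHARS = re.compile(r"[\[\]():]")


def sanitize_prompt(prompt):
    """Strip leftover syntax characters from a generated prompt."""
    return SYNTAX_CHARS.sub("", prompt)


print(sanitize_prompt("a castle, (epic lighting:1.2), [detailed]"))
# -> a castle, epic lighting1.2, detailed
```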