InvokeAI
[enhancement]: Feedback on prompt length
Is there an existing issue for this?
- [X] I have searched the existing issues
Contact Details
No response
What should this feature add?
I've been writing longer and longer prompts, and sometimes I'll keep trying to increase the weight of a term with no effect, only to find that I have to move it higher in the prompt to make it work. Clearly, I'm going over the token limit. It would be nice if this limit could be increased like in automatic1111, but failing that (or in addition), it would be nice to have some feedback about the length of the prompt so that we can see how close we are to the limit (or how far over, in my case).
Alternatives
Of course, just increasing the limit would be great, but it would still be nice to have a token counter.
Additional Content
No response
This might be useful: https://github.com/openai/tiktoken
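Worth noting: tiktoken implements the byte-pair encodings used by OpenAI's GPT models, while Stable Diffusion's text encoder is CLIP, so counts from CLIP's own tokenizer are the ones that actually matter here. A minimal counter sketch, assuming the SD 1.x tokenizer checkpoint `openai/clip-vit-large-patch14` (illustrative only, not InvokeAI's actual API):

```python
from transformers import CLIPTokenizer

# Assumption: SD 1.x uses the tokenizer from this CLIP checkpoint.
tokenizer = CLIPTokenizer.from_pretrained("openai/clip-vit-large-patch14")

def count_prompt_tokens(prompt: str) -> int:
    # Tokenize without truncation so over-long prompts are counted in full.
    ids = tokenizer(prompt, truncation=False)["input_ids"]
    return len(ids) - 2  # subtract the begin/end-of-text special tokens

limit = tokenizer.model_max_length - 2  # 77 positions minus BOS/EOS = 75 usable
prompt = "a photograph of an astronaut riding a horse"
print(f"{count_prompt_tokens(prompt)}/{limit} tokens")
```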
The token limit is hard-coded to 77 in ldm/modules/encoders/modules.py and a few other places.
I'm not sure why though. Perhaps @lstein can shed some light on this.
It's a stable diffusion limit. Other projects have some hacks that kinda get around it, but it turns out that what they're doing is basically the same thing that invoke does with blends, just with less control. As for feedback about token counts, I just submitted a PR to help with that: https://github.com/invoke-ai/InvokeAI/pull/2523
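For reference, a hedged sketch of the chunking workaround those other projects use (this is the general idea only, not InvokeAI's blend mechanism or any project's exact implementation): split the token stream into 75-token windows, encode each window separately, and concatenate the hidden states.

```python
import torch
from transformers import CLIPTokenizer, CLIPTextModel

tokenizer = CLIPTokenizer.from_pretrained("openai/clip-vit-large-patch14")
encoder = CLIPTextModel.from_pretrained("openai/clip-vit-large-patch14")

def encode_long_prompt(prompt: str) -> torch.Tensor:
    ids = tokenizer(prompt, truncation=False)["input_ids"][1:-1]  # strip BOS/EOS
    chunks = [ids[i:i + 75] for i in range(0, len(ids), 75)]
    embeddings = []
    with torch.no_grad():
        for chunk in chunks:
            # Re-add BOS/EOS and pad each window out to the 77-position limit.
            window = [tokenizer.bos_token_id] + chunk + [tokenizer.eos_token_id]
            window += [tokenizer.pad_token_id] * (77 - len(window))
            out = encoder(torch.tensor([window]))
            embeddings.append(out.last_hidden_state)
    return torch.cat(embeddings, dim=1)  # shape (1, 77 * n_chunks, 768)
```

The catch, as noted above, is that attention weight still falls off within each window, so this trades explicit control (blends) for an implicit, position-dependent split.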
Gotcha, and I was just reading #1541 as well. The problem, of course, is that the prompt structures aren't compatible at multiple levels, so some sort of translator (perhaps one that maps to blends?) would be necessary.
At the very least, 77 should be assigned to a constant in the code. 😄
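Something as small as this would do; a sketch with hypothetical names, not the actual InvokeAI module layout:

```python
# Hypothetical ldm/constants.py -- module and names are illustrative.
# CLIP's text encoder has 77 position embeddings; BOS and EOS take two,
# leaving 75 positions for actual prompt tokens.
CLIP_MAX_POSITIONS = 77
PROMPT_TOKEN_LIMIT = CLIP_MAX_POSITIONS - 2
```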
I just spent a few hours wondering why my InvokeAI results ended up with the wrong location until I moved the location keyword higher in the prompt. 🤦♂️
Very frustrating experience.
So then what do you think should be done here (vote with emoji)?
🇦 Give a simple indicator (e.g. red) when a prompt exceeds 77 tokens
🇧 Show a pop-up detailing why a prompt is invalid?
🇨 Something else?
Closing this, as prompts should be allowed to exceed 77 tokens. However, there could potentially be some visual indicator (a hard problem) to inform the user of where breaks are added.
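One way the "where breaks are added" part could be approximated, sketched here with a fast tokenizer's offset mapping (illustrative only, not a committed design): map each 75-token chunk boundary back to a character offset in the prompt string, which a UI could then highlight.

```python
from transformers import CLIPTokenizerFast

tokenizer = CLIPTokenizerFast.from_pretrained("openai/clip-vit-large-patch14")

def chunk_break_offsets(prompt: str, chunk_size: int = 75) -> list[int]:
    # Fast tokenizers can map each token back to a character span in the input.
    enc = tokenizer(prompt, truncation=False, add_special_tokens=False,
                    return_offsets_mapping=True)
    offsets = enc["offset_mapping"]
    # Character positions where each new 75-token chunk begins.
    return [offsets[i][0] for i in range(chunk_size, len(offsets), chunk_size)]
```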