
[enhancement]: Feedback on prompt length

whosawhatsis opened this issue

Is there an existing issue for this?

  • [X] I have searched the existing issues

Contact Details

No response

What should this feature add?

I've been writing longer and longer prompts, and sometimes I'll keep trying to increase the weight of a term with no effect, only to find that I have to move it higher in the prompt to make it work. Clearly, I'm going over the token limit. It would be nice if this limit could be increased, as in automatic1111, but failing that (or in addition), it would be nice to have some feedback about prompt length so we can see how close we are to the limit (or how far over, in my case).

Alternatives

Of course, just increasing the limit would be great, but it would still be nice to have a token counter.

Additional Content

No response

whosawhatsis · Nov 30 '22 05:11

This might be useful: https://github.com/openai/tiktoken
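For what it's worth, tiktoken covers OpenAI's BPE encodings, while Stable Diffusion's text encoder uses CLIP's tokenizer, so a CLIP-based count would match what the pipeline actually sees. A minimal sketch using Hugging Face `transformers` (the model name and prompt are just for illustration):

```python
# Minimal sketch: count prompt tokens with the CLIP tokenizer that
# Stable Diffusion 1.x uses, so the count matches what the model sees.
from transformers import CLIPTokenizer

tokenizer = CLIPTokenizer.from_pretrained("openai/clip-vit-large-patch14")
prompt = "a photorealistic portrait of an astronaut, golden hour, 85mm"
ids = tokenizer(prompt)["input_ids"]  # includes the BOS and EOS markers
print(f"{len(ids)}/77 tokens used")
```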

whosawhatsis · Dec 16 '22 21:12

The token limit is hard-coded to 77 in ldm/modules/encoders/modules.py and a few other places.
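For context, the cap typically lives in the tokenizer call inside the CLIP text encoder. A self-contained sketch of the shape of that code (the actual lines in ldm/modules/encoders/modules.py may differ):

```python
# Illustrative sketch only, not the literal InvokeAI code: CLIP's
# tokenizer is called with truncation at 77 tokens, so anything past
# the cap is silently dropped before the model ever sees it.
from transformers import CLIPTokenizer

tokenizer = CLIPTokenizer.from_pretrained("openai/clip-vit-large-patch14")
batch_encoding = tokenizer(
    "a very long prompt ...",
    truncation=True,        # discard everything past max_length
    max_length=77,          # 75 usable tokens + BOS + EOS
    padding="max_length",
    return_tensors="pt",
)
tokens = batch_encoding["input_ids"]  # shape: (1, 77)
```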

I'm not sure why, though. Perhaps @lstein can shed some light on this.

dsully · Feb 05 '23 05:02

> I'm not sure why, though. Perhaps @lstein can shed some light on this.

It's a Stable Diffusion limit. Other projects have hacks that sort of get around it, but it turns out that what they're doing is basically the same thing that Invoke does with blends, just with less control. As for feedback about token counts, I just submitted a PR to help with that: https://github.com/invoke-ai/InvokeAI/pull/2523
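For the curious, a rough sketch of that chunking workaround: split the token ids into 75-token windows, encode each window separately with its own BOS/EOS pair, and concatenate the per-window embeddings. The model name, padding choice, and function name below are assumptions for illustration, not any project's actual implementation:

```python
# Hypothetical sketch of the chunked-encoding workaround used by other
# UIs (simplified; real implementations also handle per-token weights).
import torch
from transformers import CLIPTextModel, CLIPTokenizer

tokenizer = CLIPTokenizer.from_pretrained("openai/clip-vit-large-patch14")
encoder = CLIPTextModel.from_pretrained("openai/clip-vit-large-patch14")

def encode_long_prompt(prompt: str) -> torch.Tensor:
    ids = tokenizer(prompt, add_special_tokens=False)["input_ids"]
    # 75-token windows, each wrapped in its own BOS/EOS pair.
    chunks = [ids[i : i + 75] for i in range(0, len(ids), 75)]
    embeddings = []
    with torch.no_grad():
        for chunk in chunks:
            window = [tokenizer.bos_token_id] + chunk + [tokenizer.eos_token_id]
            window += [tokenizer.eos_token_id] * (77 - len(window))  # pad to 77
            hidden = encoder(torch.tensor([window])).last_hidden_state
            embeddings.append(hidden)
    # Concatenating along the sequence axis yields one long conditioning
    # tensor for the UNet's cross-attention to consume.
    return torch.cat(embeddings, dim=1)
```

Invoke's blend syntax instead encodes each sub-prompt separately and combines the conditioning with explicit weights, along the lines of `("a tall thin man on a beach", "an oil painting").blend(0.7, 0.3)` (the prompts and weights here are made up).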

whosawhatsis · Feb 05 '23 06:02

Gotcha, and I was just reading #1541 as well. The problem, of course, is that the prompt structures aren't compatible at multiple levels, so some sort of translator (to use blends as well?) would be necessary.

At the very least, 77 should be assigned to a constant in the code. 😄
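A minimal sketch of that suggestion (the module path and name are hypothetical):

```python
# Hypothetical, e.g. in a shared ldm/constants.py: define the limit once
# and import it everywhere, instead of repeating the literal 77.
MAX_PROMPT_TOKENS = 77  # CLIP context length: 75 usable tokens + BOS + EOS
```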

dsully · Feb 05 '23 16:02

I just spent a few hours wondering why my invoke-ai results ended up in the wrong location until I moved the location keyword up. 🤦‍♂️

Very frustrating experience.

GammelSami · Apr 06 '23 16:04

So then what do you think should be done here (vote with emoji)?

🇦 Give a simple indicator (e.g. red) when a prompt exceeds 77 tokens
🇧 Show a pop-up detailing why a prompt is invalid
🇨 Something else?

src-r-r · Jan 18 '24 19:01

Closing this, as prompts should be allowed to exceed 77 tokens. However, there could potentially be some visual indicator (a hard problem) to inform the user of where breaks are added.
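One possible shape for that indicator, sketched with the fast CLIP tokenizer so that character offsets are available (the function name and window size are assumptions):

```python
# Hypothetical sketch: compute the character positions in a prompt where
# 75-token chunk breaks would fall, so a UI could highlight them.
from transformers import CLIPTokenizerFast

tokenizer = CLIPTokenizerFast.from_pretrained("openai/clip-vit-large-patch14")

def break_positions(prompt: str, window: int = 75) -> list[int]:
    enc = tokenizer(prompt, add_special_tokens=False,
                    return_offsets_mapping=True)
    offsets = enc["offset_mapping"]  # (start, end) character span per token
    return [offsets[i][0] for i in range(window, len(offsets), window)]
```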

hipsterusername · Feb 21 '24 15:02