stable-diffusion-webui-forge
Prompt word wrap length limit (in tokens - for texts shorter than specified, if they don't fit into 75 token limit, move them to the next 75 token chunk) 255?
Sorry for my ignorance, but if Flux can handle 255 tokens in a chunk, this "Prompt word wrap length limit" setting feels like a downgrade of Flux's capabilities. I tried to change it in some config, but I can't find it. Could somebody tell me where to change this, for a better understanding of these chunks in Flux, if it really matters? Maybe it's in the code.
It's just a visual thing. I don't think the value can be tweaked for the UI, but behind the scenes it works with up to 255 tokens:
@torch.inference_mode()
def get_prompt_lengths_on_ui(self, prompt):
    token_count = len(self.text_processing_engine_t5.tokenize([prompt])[0])
    return token_count, max(255, token_count)  # <-----
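A minimal standalone sketch of what that return value does for the UI counter: the displayed denominator never drops below 255, but grows with the prompt once the count exceeds it. The stub tokenizer below is a hypothetical stand-in for `self.text_processing_engine_t5`, just to make the example runnable.

```python
def tokenize_stub(prompt):
    # Stand-in tokenizer (assumption): one "token" per whitespace-separated word.
    return prompt.split()

def get_prompt_lengths_on_ui(prompt):
    token_count = len(tokenize_stub(prompt))
    # Same logic as the snippet above: the UI maximum is at least 255,
    # but tracks the actual count for longer prompts.
    return token_count, max(255, token_count)

print(get_prompt_lengths_on_ui("a photo of a cat"))  # (5, 255)
print(get_prompt_lengths_on_ui("word " * 300))       # (300, 300)
```

So a short prompt displays as "5/255", and a 300-token prompt simply displays as "300/300" rather than being truncated.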