
Entropy optim uses BERT - how to address situations where token length gets above 512?

Open mriganktiwari opened this issue 2 years ago • 1 comments

max_position_embeddings is 512 for most encoder models I've seen. How can we optimize prompts longer than 512 tokens?

mriganktiwari (Aug 25 '23, 14:08)

Hmm, I should've seen this coming. The way to go is to break the input into chunks of at most max_len tokens, run entropy optimization on each chunk separately, and then combine the results.
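A minimal sketch of that chunking workaround (not the library's actual API; `prune_chunk` is a hypothetical stand-in for the per-chunk entropy optimizer):

```python
# Split token ids into windows that fit the encoder's max_position_embeddings,
# prune each window independently, then concatenate the surviving tokens.
from typing import Callable, List

def chunked_optimize(
    token_ids: List[int],
    prune_chunk: Callable[[List[int]], List[int]],  # hypothetical per-chunk optimizer
    max_len: int = 512,
) -> List[int]:
    """Apply `prune_chunk` to consecutive windows of at most `max_len` tokens."""
    pruned: List[int] = []
    for start in range(0, len(token_ids), max_len):
        chunk = token_ids[start:start + max_len]  # never exceeds max_len
        pruned.extend(prune_chunk(chunk))
    return pruned

# Example with a dummy pruner that keeps every second token:
tokens = list(range(1200))  # stand-in for a 1200-token prompt
result = chunked_optimize(tokens, lambda c: c[::2], max_len=512)
print(len(result))  # chunks of 512, 512, 176 halve to 256 + 256 + 88 = 600
```

One caveat with this approach: tokens near a chunk boundary lose cross-chunk context, so entropy estimates there may be less reliable than on a model that could attend over the full prompt.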

vaibkumr (Aug 27 '23, 00:08)